Field development optimization is a crucial problem under economic and physical constraints. Advanced technologies such as smart wells and completions, together with well performance optimization, allow recovery to be maximized without additional drilling and its associated costs. This strategy applies to mature and challenging heterogeneous reservoirs with water and gas coning, to low-permeability and gas-condensate deposits, and to other settings. The presented method considers all field constraints and delivers predictive optimal well control settings that account for reservoir uncertainties, offering optimized development strategies while minimizing risk and cost.
We present a new field development technology to optimize well performance in both oil and gas fields. The methodology allows simultaneous optimization under uncertainty of multiple wells using surface constraints and subsurface flow control devices. The method is fast, combining gradient and stochastic techniques. Starting from an ensemble of realizations of the reservoir model, the best well control settings are found with a steepest-ascent gradient search, in which the search direction is approximated by the cross-covariance of the ensemble of objectives predicted with cloned reservoir simulations over a user-defined future time horizon. The workflow employs several levels of computational complexity reduction, combining advanced hardware architecture and localization algorithms, to speed up optimization and cover a wider range of spatial uncertainty.
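The search-direction approximation described above, in which the gradient is estimated from the cross-covariance between perturbed controls and their simulated objective values, can be sketched as follows. This is a minimal illustration in the spirit of ensemble-based optimization, not the authors' implementation; the function names, step size, and ensemble settings are assumptions.

```python
import numpy as np

def ensemble_gradient(objective, u, n_ens=50, sigma=0.1, seed=0):
    """Approximate the search direction as the cross-covariance between
    an ensemble of perturbed control vectors and their objective values
    (each objective call stands in for a cloned reservoir simulation)."""
    rng = np.random.default_rng(seed)
    U = u + sigma * rng.standard_normal((n_ens, u.size))  # perturbed controls
    J = np.array([objective(ui) for ui in U])             # simulated objectives
    dU = U - U.mean(axis=0)
    dJ = J - J.mean()
    return dU.T @ dJ / (n_ens - 1)                        # cross-covariance

def steepest_ascent(objective, u0, step=0.5, iters=40):
    """Steepest-ascent search over well control settings using the
    ensemble-approximated gradient as the ascent direction."""
    u = np.asarray(u0, dtype=float)
    for _ in range(iters):
        g = ensemble_gradient(objective, u)
        u = u + step * g / (np.linalg.norm(g) + 1e-12)    # normalized step
    return u
```

In the actual workflow each objective evaluation would be a reservoir simulation over the user-defined horizon, and the step size would be managed by a line search rather than fixed.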
We present two examples, one controlling the surface liquid rates of producers and the injection rates of injectors, and one controlling the flow control valves of producer and injector wells, to demonstrate both the efficiency and robustness of the optimization method. The method is robust and delivers optimal field development plans with high confidence, making full use of advanced software and hardware technology. The presented process results in significant cost reductions in drilling and other operational activities.
Characterizing fractures is an important task for improving the understanding and utilization of hydraulic fracturing. As an approach to augment and improve on existing methods, time-lapse electric potential measurements could be used to characterize subsurface features. In this study we investigated the characterization of fracture length and fracture density using time-lapse electric potential data. A new borehole ERT (electric resistivity tomography) method designed specifically for hydraulic fracture characterization is proposed to better capture reservoir dynamics during hydraulic fracturing. This method obtains high-resolution electric potential data by placing electrodes in or near boreholes and monitoring the electric potential distribution near the horizontal fracture zone. The time-lapse electric potential data generated by this tool were simulated and subsequently used to analyze fracture characteristics. Inverse analysis was then performed on the electric potential data to estimate fracture length and fracture density. Last, we performed sensitivity analysis to examine the robustness of the estimates in nonideal environments. The results of this work show that time-lapse electric potential data are capable of capturing flow dynamics during the fracturing process. Using the proposed borehole ERT method we successfully estimated the true fracture length and true fracture density of a constructed fracture model. We were able to determine the best locations in the constructed reservoir to place the electrodes, and through sensitivity analysis we found the maximum noise level of the electric potential data at which the proposed method can still make robust fracture length and fracture density estimates.
Our proposed method offers a new approach to make robust estimates of fracture length and fracture density. Electric potential data have been used mostly for well logging in the past. This study demonstrates a novel way of using electric potential data in unconventional development and opens possibilities for more applications such as production monitoring.
In the petroleum industry, references to a new science called Operations Research appeared in the literature in the late 1950s; however, over the last 60 years, use of Operations Research optimization techniques in the petroleum industry has been sporadic, resulting in vast untapped optimization opportunities. Some subsets of the petroleum industry have successfully used Operations Research optimization techniques for specific applications; however, when these techniques were not used by the industry, the stated reasons involved difficulty dealing with non-linearities and stochastic elements, insufficient computational power to solve realistic models, and the necessity for specialized knowledge of Operations Research optimization software and solvers. In this paper we provide a history of the use of Operations Research optimization methods in the petroleum industry by presenting a comprehensive review of papers that use these techniques. We focus on linear, nonlinear, integer, and mixed-integer optimization methods and the evolution of these models in the petroleum industry over time.
Zalavadia, Hardikkumar (Texas A&M University) | Sankaran, Sathish (Anadarko Petroleum Corporation) | Kara, Mustafa (Anadarko Petroleum Corporation) | Sun, Wenyue (Anadarko Petroleum Corporation) | Gildin, Eduardo (Texas A&M University)
Model-based field development planning and optimization often require computationally intensive reservoir simulations, where the models need to be run many times under input uncertainty or in search of optimal results. Reduced Order Modeling (ROM) methods are a class of techniques applied to reservoir simulation to reduce model complexity and speed up computations, especially for large-scale or complex models, and can be quite useful for such optimization problems. While intrusive ROM methods (such as proper orthogonal decomposition (POD) and its extensions, trajectory piecewise linearization (TPWL), and the Discrete Empirical Interpolation Method (DEIM)) have been proposed for reservoir simulation problems, these remain inaccessible or unusable for a large number of practical applications that rely on commercial simulators.
In this paper, we describe a novel application of a non-intrusive ROM method, dynamic mode decomposition (DMD). We specifically look at reducing the time complexity of the well control optimization problem using a variant of DMD called DMDc (DMD with control). We propose a workflow that uses a training dataset from the wells and predicts the state solution (pressure and saturation) for a new set of bottomhole pressure profiles encountered during the optimization runs. We use a novel strategy to select the basis dimensions to prevent unstable solutions. Since the objective function of the optimization problem is usually based on fluid production profiles, we propose a strategy to predict the fluid production rates from the DMDc-predicted states using machine learning techniques. The features for this machine learning problem are designed based on the physics of fluid flow through well perforations, which results in very accurate rate predictions. We compare the proposed methodology against another variant of DMD, ioDMD (input-output DMD), used for system identification to predict output production flow rates.
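The core DMDc identification step, fitting a linear model x_{k+1} ≈ A x_k + B u_k from state snapshots and control inputs, can be sketched as follows. This is a bare least-squares illustration: it omits the SVD-based rank truncation and the basis-selection strategy discussed in the paper, and all names are illustrative.

```python
import numpy as np

def dmdc_fit(X, Xp, U):
    """Identify x_{k+1} ≈ A x_k + B u_k from snapshots (DMD with control).
    X, Xp: (n_states, m) state snapshots at steps k and k+1; U: (n_inputs, m)."""
    Omega = np.vstack([X, U])
    G = Xp @ np.linalg.pinv(Omega)       # least-squares operator [A B]
    n = X.shape[0]
    return G[:, :n], G[:, n:]

def dmdc_predict(A, B, x0, U):
    """Roll the identified model forward for a new control (e.g. BHP) sequence."""
    xs = [np.asarray(x0, dtype=float)]
    for u in U.T:
        xs.append(A @ xs[-1] + B @ u)
    return np.column_stack(xs)
```

For reservoir-scale states, a practical implementation would project onto a truncated SVD basis of the snapshots before fitting, which is where the stability of the chosen basis dimension matters.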
The methodology is demonstrated on a benchmark case and a Gulf of Mexico deepwater field, showing a significant time reduction in the production control optimization problem, with a 30–40x speedup for the proposed DMDc workflow compared to fine-scale simulations, while preserving the accuracy of the solutions. The proposed "non-intrusive" approach to reducing model complexity can substantially increase the range of application of ROM methods for practical field development and reservoir management.
You, Junyu (Petroleum Recovery Research Center) | Ampomah, William (Petroleum Recovery Research Center) | Sun, Qian (Petroleum Recovery Research Center) | Kutsienyo, Eusebius Junior (Petroleum Recovery Research Center) | Balch, Robert Scott (Petroleum Recovery Research Center) | Cather, Martha (Petroleum Recovery Research Center)
In this paper, a hybrid scheme that couples an artificial neural network (ANN) and multi-objective optimizers is structured to co-optimize oil recovery and carbon storage in CO2-EOR processes. The workflow is developed and validated employing an injection-pattern-based model. A field-scale case study is presented to demonstrate the practicability of the workflow.
An injection-pattern-based reservoir model, built with a compositional numerical simulator, is established to develop and test the hybrid optimization workflow. The scheme optimizes objective functions including the oil recovery factor, CO2 storage, and project net present value (NPV). An ANN expert system is trained and employed as a proxy for the high-fidelity model in the optimization process. The ANN model is trained with a robust procedure capable of finding the best architecture. Particle swarm optimization (PSO) is coupled with the developed proxy model to optimize a weight-aggregated objective function, as well as multiple objective functions via a Pareto-front approach. A field case study is included in this paper. The reservoir model is well tuned via a rigorous history-matching process using the available field data. The aforementioned workflow is deployed to optimize the tertiary recovery stage of the field development.
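The PSO step over a weight-aggregated objective can be illustrated with a minimal optimizer sketch. Here `f` stands in for the trained ANN proxy (in the workflow it would return a weighted sum of predicted recovery, CO2 storage, and NPV); all parameter values are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def pso_maximize(f, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimizer (maximization) for a
    weight-aggregated objective evaluated on a proxy model."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    x = rng.uniform(lo, hi, (n_particles, lo.size))       # initial swarm
    v = np.zeros_like(x)
    pbest = x.copy()
    pval = np.array([f(p) for p in x])                    # personal bests
    gbest = pbest[pval.argmax()].copy()                   # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                        # keep controls in bounds
        val = np.array([f(p) for p in x])
        improved = val > pval
        pbest[improved], pval[improved] = x[improved], val[improved]
        gbest = pbest[pval.argmax()].copy()
    return gbest, pval.max()
```

Because each evaluation hits the cheap ANN proxy rather than the compositional simulator, the swarm can afford thousands of objective calls, which is what makes the hybrid scheme practical.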
In this paper, the validation results of the proxy model are compared against results from the high-fidelity numerical models. Investigations focus on comparing the optimum solution found with the aggregated objective function and the solution repository (Pareto front) generated by the multi-objective optimization process. The optimization results provide significant insight into the decision-making process for CO2-EOR projects when multiple objective functions are considered.
This study develops a novel hybrid optimization workflow for CO2-EOR projects considering multiple objective functions. The robustness of the development is confirmed via a field case study. Moreover, this work investigates the relationship between the solutions of the aggregated objective function and the Pareto front, which provides constraints and reduces the uncertainties introduced by the multi-objective optimization process.
We develop a novel ensemble model-maturation method that is based on the Randomized Maximum Likelihood (RML) technique and adjoint-based computation of objective function gradients. The new approach is especially relevant for rich data sets with time-lapse information content. The inversion method that solves the model-maturation problem takes advantage of the adjoint-based computation of objective function gradients for a very large number of model parameters at the cost of one forward and one backward (adjoint) simulation. The inversion algorithm calibrates model parameters to arbitrary types of production data, including time-lapse reservoir-pressure traces, by use of a weighted and regularized objective function. We have also developed a new and effective multigrid preconditioning protocol for accelerated iterative linear solutions of the adjoint-simulation step for models with multiple levels of local grid refinement. The protocol is based on a geometric multigrid (GMG) preconditioning technique. Within the model-maturation workflow, a machine-learning technique is applied to establish links between the mesh-based inversion results (e.g., permeability-multiplier fields) and geologic modeling parameters inside a static model (e.g., object dimensions). Our workflow integrates the learnings from inversion back into the static model and thereby ensures the geologic consistency of the static model while improving the quality of the ensuing dynamic model in terms of honoring production and time-lapse data and reducing forecast uncertainty. This use of machine learning to post-process the model-maturation outcome effectively converts the conventional continuous-parameter history-matching result into a discrete tomographic inversion result constrained to geological rules encoded in training images.
We demonstrate the practical utilization of the adjoint-based model-maturation method on a large time-lapse reservoir-pressure data set using an ensemble of full-field models from a reservoir case study. The model-maturation technique effectively identifies the permeability modification zones that are consistent with alternative geological interpretations and proposes updates to the static model. Upon these updates, the model not only agrees better with the time-lapse reservoir-pressure data but also better honors the tubing-head pressure as well as production logging data. We also provide computational performance indicators that demonstrate the accelerated convergence characteristics of the new iterative linear solver for adjoint equations.
Reservoir simulation optimization under uncertainty typically invokes a sense of anxiety, mainly because there is no systematic criterion for choosing between development scenarios under uncertainty, it is unclear how to approach well placement and well control optimization in the face of a large uncertainty ensemble of static realisations, and, above all, a large number of simulation runs potentially needs to be conducted. This is exacerbated when the models are large and require many hours to run. Moreover, even with the prevalence of distributed and parallel computing clusters, there is still a limited amount of computing resources available when spread across the reservoir engineers within a company. Time and budget constraints further complicate this process. Furthermore, the requirement of an inordinately large number of simulation runs brings the dilemma of which optimizer to choose to help speed up the process.
This paper starts with a brief background on historical attempts at tackling this problem by delving into the literature. It then discusses a rigorous criterion for optimization under uncertainty, viz. stochastic dominance, hitherto little known or used in the industry. A commonly used greenfield case study, an ensemble set of uncertainty realisations, is then introduced; the rest of the paper is based on it. The ensemble is a pre-generated set of fifty realisations designed specifically for this problem. Two challenging areas are then addressed, viz. well placement optimization under uncertainty and well controls optimization under uncertainty.
Finally, a comparison between the simplex, proxy response surface, differential evolution, and particle swarm optimization methods is made for the optimization of well controls. The paper thus aims to give a complete picture of how to approach reservoir simulation optimization under uncertainty with a drastically reduced number of computational runs. Practical and sensible formulation of the optimization problem can go a long way toward making this process more understandable and easier to implement.
Improved reservoir knowledge is key to extracting additional value from existing oil and gas assets. However, given the uncertainty in the subsurface, it is always a question whether the current development strategy is the most robust choice, or whether there are alternatives that can further increase the value of the field. This paper presents a novel solution that enables the asset team to answer these questions in a new way. Furthermore, the solution helps teams quickly identify and screen new opportunities that ultimately increase both subsurface understanding and the value of the field. The solution combines a quasi-Newton gradient-based numerical optimization scheme with a stochastic simplex approximate gradient (StoSAG) algorithm. Because the algorithm is non-intrusive with respect to the fluid flow simulator, we can apply the solution directly to any flow optimization problem without access to the simulator source code. The solution is implemented using a microservice architecture that allows for efficient scaling and deployment on either cloud-based or internal systems. We demonstrate the proposed solution on a field containing 11 oil producers and 7 water injectors by optimizing the water injection and oil production rates. The machine learning algorithm allows us to quickly explore different drainage strategies, given the current understanding and associated uncertainties of the reservoir.
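A StoSAG-style gradient estimate, which averages simplex gradients over an ensemble of model realizations to optimize the expected objective, could be sketched as follows. Here `sim` stands in for the (black-box) flow simulator, and the perturbation size and count are illustrative assumptions.

```python
import numpy as np

def stosag_gradient(sim, u, realizations, n_pert=10, sigma=0.05, seed=0):
    """Stochastic-simplex-style approximate gradient of the expected
    objective over an ensemble of model realizations: each realization
    contributes a least-squares (simplex) gradient estimated from random
    control perturbations, and the contributions are averaged."""
    rng = np.random.default_rng(seed)
    u = np.asarray(u, dtype=float)
    g = np.zeros_like(u)
    for m in realizations:
        dU = sigma * rng.standard_normal((n_pert, u.size))      # control perturbations
        dJ = np.array([sim(u + d, m) - sim(u, m) for d in dU])  # objective changes
        g += np.linalg.pinv(dU) @ dJ                            # simplex gradient
    return g / len(realizations)
```

Because only simulator outputs are needed, this kind of estimate is non-intrusive: it requires no access to the simulator source code, which is what allows the approach to wrap any flow simulator. A quasi-Newton scheme would then use these gradient estimates in place of exact derivatives.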
Specifically, the software solution suggests that 6 of the 18 pre-defined well targets are high risk and/or of little value. Running a second development scenario in which these six wells are not drilled reduces the investment cost of the field by 163 MUSD and increases the expected net present value per well by 48%. Compared with the reactive-control drainage strategy, we increase the expected net present value of the field by 9.0% while simultaneously lowering the associated risk.
Al-Jenaibi, Faisal (ADNOC - Upstream) | Shelepov, Konstantin (Rock Flow Dynamics) | Kuzevanov, Maksim (Rock Flow Dynamics) | Gusarov, Evgenii (Rock Flow Dynamics) | Bogachev, Kirill (Rock Flow Dynamics)
The application of intelligent algorithms that use clever simplifications and methods to solve computationally complex problems is rapidly displacing traditional methods in the petroleum industry. The latest forward-thinking approaches in history matching and uncertainty quantification were applied to a dynamic model with an unknown permeability model. The original perm-poro profile was constructed from synthetic data so that the Assisted History Matching (AHM) approach could be compared to the exact solution. It is assumed that relative permeabilities, endpoints, and all parameters other than absolute permeability cannot be modified to match oil/water/gas rates, gas-oil ratio, water injection rate, watercut, and bottomhole pressure.
The standard approach to matching a model via permeability variation is to split the grid into several regions. However, this process is largely guesswork, as it is unclear in advance how to select the regions, and geological prerequisites for such a splitting usually do not exist. Moreover, the values of permeability and porosity in different grid blocks are correlated, and changing these values independently for each region distorts the correlations or makes the model unphysical.
The proposed alternative decomposes the permeability model into spectrum amplitudes using the Discrete Cosine Transformation (DCT), a form of Fourier transform. The weighted sum of all DCT basis functions reproduces the original property distribution. An uncertain permeability model typically involves subjective judgment and several optimization runs to construct the uncertainty matrix. The proposed multi-objective Particle Swarm Optimization (PSO), however, helps to reduce randomness and find an optimal solution, not dominated in any objective, with fewer runs. Further tuning of the Flexi-PSO algorithm is performed on its constituent components, such as swarm size, inertia, nostalgia, sociality, damping factor, neighbor count, neighborliness, the proportion of explorers, egoism, community, and relative critical distance, to increase the speed of convergence. Additionally, a clustering technique based on Principal Component Analysis (PCA) is suggested as a means to reduce the dimensionality of the resulting solution space while ensuring the diversity of the selected cluster centers.
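The DCT parameterization could be sketched as follows, assuming SciPy's `dctn`/`idctn`: keeping only the largest-magnitude spectrum amplitudes yields a low-dimensional representation of the permeability field. This is an illustration of the decomposition idea, not the authors' code, and the truncation rule is an assumption.

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_truncate(field, n_keep):
    """Decompose a 2-D property field into DCT spectrum amplitudes and
    keep only the n_keep largest-magnitude coefficients, giving a
    low-dimensional parameterization of the field."""
    coeffs = dctn(field, norm='ortho')
    thresh = np.sort(np.abs(coeffs).ravel())[-n_keep]      # n_keep-th largest magnitude
    kept = np.where(np.abs(coeffs) >= thresh, coeffs, 0.0)
    return idctn(kept, norm='ortho'), kept
```

In the history-matching setting, the retained amplitudes become the optimization variables, so the PSO searches a space of a few dozen coefficients rather than per-cell permeability values, while the inverse transform keeps the updated field spatially smooth and correlated.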
The presented set of methods helps to achieve a qualitative and quantitative match with respect to any property, reduce the number of uncertainty parameters, and set up a generic and efficient approach to assisted history matching.
Standard approaches to optimization under uncertainty in reservoir simulation require use of multiple realizations, with variable parameters representing operational constraints and actions as well as uncertain scenarios. We will show how appropriate use of local optimization within the simulation model, using customized logic for field management strategies, can bring improved workflow flexibility and efficiency, by reducing the effort needed for uncertainty iterations.
To achieve meaningful forecasts for an ensemble of uncertain scenarios, it is important to distinguish between different types of decision. Investment decisions, such as facilities sizing, depend on global unknowns and must be optimized for the complete ensemble. Operational actions, such as closing a valve, can be optimized instantaneously for individual scenarios, using measurable information, although subject to constraints determined at a global level. In this study, we implement local optimization procedures within simulation cases, combining customized objective criteria to rank reactive or proactive actions, with the ability to query reservoir flow entities at appropriate frequencies.
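The kind of local, rule-based ranking of operational actions described above might be sketched as follows. The well schema, thresholds, and ranking criterion are purely hypothetical placeholders for the customized field-management logic the paper implements inside the simulator.

```python
def rank_and_apply_actions(wells, watercut_limit=0.9, min_field_rate=5000.0):
    """Reactive field-management logic: rank producers by water cut and shut
    the worst offenders, while honoring a global constraint that total field
    oil rate stays above min_field_rate.
    wells: list of dicts with 'name', 'oil_rate', 'water_cut' (hypothetical schema)."""
    actions = []
    field_oil = sum(w['oil_rate'] for w in wells)
    # Rank candidate actions by the objective criterion (highest water cut first).
    for w in sorted(wells, key=lambda w: w['water_cut'], reverse=True):
        if w['water_cut'] > watercut_limit and field_oil - w['oil_rate'] >= min_field_rate:
            actions.append(('shut', w['name']))
            field_oil -= w['oil_rate']
    return actions, field_oil
```

Evaluated at each report step inside the simulation, such a rule acts instantaneously on measurable per-scenario information, while the thresholds themselves (the global constraints) are the quantities optimized across the ensemble.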
The methods presented in the paper can be used for reactive response modeling for smart downhole control; optimization of ESP/PCP pump performance; and implementation of production plans subject to defined downstream limits. For selected cases, we compare the advantages and disadvantages of the local optimization approach with standardized "big-loop" uncertainty workflows. The methodology can significantly reduce optimization costs, particularly for high-frequency actions, achieving similar objective function values in a fraction of the time needed for post-processing optimizers. Use of tailored scripting provides the capability to modernize the logic framework for field management decisions, with realistic representation of smart field equipment and flow entities at any level of complexity.
Use of efficient workflows as described in this paper can reduce the cost of multiple realization studies significantly, or enable engineers to consider a wider range of possible scenarios, for deeper understanding and better risk mitigation.