Kumar, Anjani (Computer Modelling Group Ltd) | Novlesky, Alex (Computer Modelling Group Ltd) | Bityutsky, Erykah (Computer Modelling Group Ltd) | Koci, Paul (Consultant for Occidental Petroleum Corporation) | Wightman, Jeff (Occidental Petroleum Corporation)
Heavy oil reservoirs often require thermal enhanced oil recovery (EOR) processes to improve the mobility of the highly viscous oil. In steam flooding operations, finding optimal steam injection rates is critical given the high cost of steam generation and the current low oil price environment. Steam injection and allocation thus become an exercise in optimizing cost, productivity, and net present value (NPV). As a field matures, producers face declining oil rates and increasing steam-oil ratios (SOR). Operators must reduce injection rates on declining groups of wells to maintain a low SOR and free up capacity for newer, more productive groups of wells. They also need a strong surveillance program to monitor field operational parameters such as SOR, the remaining oil-in-place (OIP) distribution in the reservoir, steam breakthrough in the producers, and temperature surveys in observation wells. Using the surveillance data in conjunction with reservoir simulation, operators must determine a go-forward operating strategy for the steam injection process.
The proposed steam flood optimization workflow incorporates field surveillance data and numerical simulation, driven by machine learning and AI-enabled algorithms, to predict future steam flood reservoir performance and maximize NPV for the reservoir. The process intelligently determines optimal field-level and well-level injection rates, how long to inject at those rates, and how fast to reduce rates on mature wells so that steam can be reallocated to newly developed regions of the field. A case study was performed on a subsection of a Middle Eastern reservoir containing eight vertical injectors and four sets of horizontal producers with laterals landed in multiple reservoir zones. Following just the steam reallocation optimization process, NPV for the section improved by 42.4%, with a corresponding 24% decrease in cumulative SOR. If workovers and alternate wellbore designs are also considered in the optimization process, the NPV for the section has the potential to improve by 94.7%, with a corresponding 32% decrease in cumulative SOR. This workflow can be extended and applied to a full-field steam injection project.
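The two objective metrics quoted above, NPV and cumulative SOR, can be sketched in a few lines. This is a minimal illustration under assumed economics, not the paper's model: all rates, prices, and the discount rate below are hypothetical.

```python
# Minimal sketch of the two steam flood objective metrics: NPV and cumulative
# steam-oil ratio (SOR). All rates, prices, and costs are hypothetical.

def npv(oil_rates, steam_rates, oil_price=60.0, steam_cost=8.0, discount=0.10):
    """Yearly discounted cash flow: oil revenue minus steam generation cost."""
    return sum((q_o * oil_price - q_s * steam_cost) / (1 + discount) ** t
               for t, (q_o, q_s) in enumerate(zip(oil_rates, steam_rates), start=1))

def cumulative_sor(oil_rates, steam_rates):
    """Cumulative steam injected divided by cumulative oil produced."""
    return sum(steam_rates) / sum(oil_rates)

oil = [120_000, 100_000, 80_000]      # bbl/year (hypothetical)
steam = [300_000, 280_000, 260_000]   # bbl CWE/year (hypothetical)
print(round(cumulative_sor(oil, steam), 2))  # 2.8
print(npv(oil, steam) > 0)
```

In an optimization loop, the allocation of `steam` across well groups would be the decision variable, with NPV as the objective and cumulative SOR tracked as a secondary metric.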
A steam injection rate that is not optimized through the life cycle of the project (e.g., held constant for a long period of time) can lead to sub-optimal performance of a thermal heavy oil recovery process. On the other hand, finding the optimal steam injection strategy (policy) is a major challenge due to the complex dynamics of the physical phenomenon: a nonlinear, slow, high-order, time-varying process in potentially highly heterogeneous reservoirs. To address this challenge, the problem can be formulated as an optimal control problem, which has typically been solved using adjoint-state optimization and a model-predictive control (MPC) strategy.
In contrast, this work presents a reinforcement learning (RL) approach in which the mathematical model of the dynamic process (SAGD) is assumed unknown. An agent is trained to find the optimal policy only through continuous interaction with the environment (e.g., a numerical reservoir simulation model). At each time step, the agent executes an action (e.g., increase the steam injection rate), receives a reward (e.g., net present value), and observes the new state (e.g., pressure distribution) of the environment. During this interaction, an action-value function is approximated; for a given state of the environment, this function yields the action that maximizes total future reward. This process continues for multiple simulations (episodes) of the dynamic process until convergence is achieved.
In this implementation, the state-action-reward-state-action (SARSA) online policy learning algorithm is employed, in which the action-value function is re-estimated after every time step and then used to choose the next action. The environment consists of a reservoir simulation model built using data from a reservoir located in northern Alberta. The model consists of one well pair (one injector and one producer), and a production horizon of 250 days (one episode) is considered. The state of the environment is defined as the cumulative oil and water production and cumulative water injection at each time step; three possible actions are considered, i.e., increase, decrease, or no change of the current steam injection rate; and the reward is the net present value (NPV). Additionally, stochastic gradient descent is used to approximate the action-value function.
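The SARSA update with SGD-trained function approximation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the reservoir simulator is replaced by a toy analytic environment, and all dynamics, reward coefficients, and hyperparameters (`ALPHA`, `GAMMA`, `EPS`) are hypothetical.

```python
import random

# Sketch of on-policy SARSA with a linear action-value function updated by SGD.
# State = (scaled cumulative oil, water, injection); actions change the steam
# rate by -1/0/+1; reward is a crude NPV proxy. Everything here is a toy
# stand-in for the reservoir simulation environment described in the abstract.

ACTIONS = (-1, 0, 1)              # decrease / hold / increase steam rate
ALPHA, GAMMA, EPS = 0.01, 0.95, 0.1

def features(state, action):
    """State-action features for the linear action-value approximation."""
    return [1.0] + list(state) + [float(action)]

def q(w, state, action):
    return sum(wi * fi for wi, fi in zip(w, features(state, action)))

def choose(w, state):
    """Epsilon-greedy action selection (on-policy)."""
    if random.random() < EPS:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q(w, state, a))

def toy_step(state, rate, action):
    """Toy dynamics: more steam gives more oil, with diminishing returns."""
    rate = max(1, min(20, rate + action))
    oil = rate / (1.0 + 0.1 * rate)
    reward = oil - 0.1 * rate                     # crude NPV proxy (hypothetical)
    new_state = (state[0] + 0.01 * oil,           # cumulative oil (scaled)
                 state[1] + 0.005 * rate,         # cumulative water (scaled)
                 state[2] + 0.01 * rate)          # cumulative injection (scaled)
    return new_state, rate, reward

def train(episodes=50, steps=25):
    w = [0.0] * 5                                 # bias + 3 state dims + action
    for _ in range(episodes):
        state, rate = (0.0, 0.0, 0.0), 5
        action = choose(w, state)
        for _ in range(steps):
            new_state, rate, reward = toy_step(state, rate, action)
            next_action = choose(w, new_state)
            # SARSA temporal-difference error and SGD weight update
            td = reward + GAMMA * q(w, new_state, next_action) - q(w, state, action)
            w = [wi + ALPHA * td * fi for wi, fi in zip(w, features(state, action))]
            state, action = new_state, next_action
    return w

random.seed(0)
weights = train()
print(len(weights))  # 5
```

The key SARSA property is visible in the `td` line: the bootstrap target uses the action actually selected by the current policy (`next_action`), not the greedy maximum, which is what makes the method on-policy.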
Results show that the optimal steam injection policy obtained using RL implementation improves NPV by at least 30% with more than 60% lower computation cost.
Moussa, Tamer (King Fahd University of Petroleum and Minerals) | Mahmoud, Mohamed (King Fahd University of Petroleum and Minerals) | Patil, Shirish (King Fahd University of Petroleum and Minerals) | Elkatatny, Salaheldin (King Fahd University of Petroleum and Minerals) | Abdelgawad, Khaled (King Fahd University of Petroleum and Minerals)
Thermal recovery methods are viable and commonly used to recover oil from heavy oil reservoirs by reducing oil viscosity and improving oil displacement. However, there are many challenges associated with conventional steam injection methods, including significant heat losses before the steam reaches the reservoir, the high cost of steam generation and injection, and the emission of greenhouse gases. Therefore, it is essential to introduce a heavy oil recovery approach in which steam is generated downhole, overcoming these challenges. However, this novel heavy-oil recovery method has several design and operational parameters that must be efficiently optimized to achieve maximum recovery from heavy-oil reservoirs at lower cost and with minimum environmental impact.
The objective of this work is to introduce a novel heavy-oil recovery technique using in-situ steam generated by downhole thermochemical reactions and to investigate the key design and operational parameters of this complex recovery process. Modified self-adaptive differential evolution (MSaDE) and particle swarm optimization (PSO) are used as global optimizers to find the design and operating parameters that achieve the maximum net present value (NPV) and highest oil recovery factor (RF) for a heavy-oil reservoir after ten years of development. A comparison of the two optimization methods is also presented.
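A global optimizer of the PSO family named above can be sketched as follows. This is a generic textbook PSO, not the paper's MSaDE or tuned PSO: the objective below is a stand-in test surface (NPV maximization is expressed as minimizing its negative), and the bounds, swarm size, and coefficients are illustrative assumptions.

```python
import random

# Minimal particle swarm optimization (PSO) sketch. Each particle tracks its
# personal best; the swarm shares a global best; velocities blend inertia,
# cognitive, and social terms. The quadratic objective is a stand-in for a
# reservoir-model NPV evaluation.

def pso(objective, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])   # cognitive pull
                             + c2 * r2 * (g[d] - pos[i][d]))          # social pull
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < objective(g):
                    g = pos[i][:]
    return g, objective(g)

random.seed(1)
# Stand-in objective: a concave "NPV" surface with its maximum at (2, 3),
# minimized here as a squared distance.
best, val = pso(lambda x: (x[0] - 2) ** 2 + (x[1] - 3) ** 2, [(0, 5), (0, 5)])
print(val < 0.1)
```

In the paper's setting, `objective` would wrap a full reservoir simulation, which is why the evaluation budget (`n_particles * iters`) becomes the dominant cost.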
The results show that downhole thermochemical reactions can be used to generate in-situ steam that efficiently reduces heavy-oil viscosity and improves oil mobility. The results also show that using MSaDE and PSO to optimize the key components of this novel recovery process significantly enhances recovery performance, in terms of both higher NPV and higher RF.
This study provides the first known in-depth optimization and uncertainty analysis outlining the significance of each design and operating parameter of the proposed thermochemical recovery process. This work demonstrated and verified the concept of using downhole thermochemical reactions as an environmentally friendly way to recover oil from heavy-oil reservoirs, and it is a step toward eliminating the greenhouse gas emissions associated with thermal recovery methods.
This paper presents a practical method for benchmarking heavy oil fields as a tool for identifying opportunities for total cost and production optimization. The method combines actual data from typical heavy oil fields to define reservoir, well, and surface complexity indices for categorizing a subject field, with a total cost breakdown model to map potential risks that could increase total cost, delay the project/process, or degrade production performance. The benchmarking process consists of four steps: 1) classification of the subject field using Front End Loading (FEL) and complexity indices that account for a) reservoir structural, stratigraphic, rock, fluid, energy, static, and dynamic complexity, b) well complexity, and c) surface process complexity; 2) selection of analog fields within the range of indices; 3) use of causal maps to identify causes of uncertainty and risks that impact capital expenditures (CAPEX), operational expenditures (OPEX), production losses, and cycle time; and 4) use of a stochastic total cost model to generate graphs showing the position of the subject field versus the analogs. A total undiscounted cost breakdown structure provided information on the most critical cost drivers, with the most significant impact corresponding to OPEX. Causal maps described typical total cost drivers for surface and subsurface.
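The stochastic total cost model in step 4 can be sketched as a Monte Carlo sum of cost components. This is a generic illustration, not the paper's model: the triangular distributions and all dollar figures below are hypothetical.

```python
import random

# Sketch of a stochastic total-cost breakdown: CAPEX, OPEX, production-loss,
# and delay costs sampled from triangular distributions and summed over Monte
# Carlo trials. Percentiles of the resulting distribution position a subject
# field against analogs. All distributions and figures are hypothetical.

def total_cost_samples(n=10_000, seed=42):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        capex = rng.triangular(80, 150, 110)   # MM$ (low, high, mode)
        opex = rng.triangular(40, 90, 60)      # MM$ over field life
        losses = rng.triangular(5, 30, 12)     # cost of production losses
        delay = rng.triangular(0, 20, 5)       # cycle-time / delay cost
        samples.append(capex + opex + losses + delay)
    return samples

def percentile(xs, p):
    xs = sorted(xs)
    return xs[int(p / 100 * (len(xs) - 1))]

s = total_cost_samples()
print(percentile(s, 10) < percentile(s, 50) < percentile(s, 90))  # True
```

Plotting the subject field's estimated cost against the P10/P50/P90 of the analog distribution gives the benchmarking graph the abstract describes.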
Temizel, Cenk (Aera Energy) | Irani, Mazda (Ashaw Energy) | Canbaz, Celal Hakan (Schlumberger) | Palabiyik, Yildiray (Istanbul Technical University) | Moreno, Raul (CSmart Recovery) | Diaz, Jose M. (VCG O&G Consultants) | Tao, Tao (Texas Southern University) | Alkouh, Ahmad (College of Technological Studies)
Along with advances in technology, greener technologies that help minimize carbon footprints are becoming more common in oilfield applications, as in other areas. Electrical heating is one of the relatively more environmentally friendly heavy oil recovery technologies. It is not new, but it has gained popularity with the advances in electrical heating equipment over the last decade; longer and more reliable operations have led to its use as a standalone recovery method rather than only a preheating method. In this study, a comprehensive investigation of production optimization is presented that includes not only the reservoir aspects but also the production and facility aspects of electrical heating in heavy oil reservoirs. A full-physics commercial simulator has been coupled with an optimization/uncertainty tool to understand the significance of the uncertainty and control variables that influence the production function, in addition to an analysis of normalized type curves in different real field cases. The challenges encountered during implementation of electrical heating processes in terms of production, reservoir, and facilities engineering are outlined in order to provide a comprehensive and practical implementation perspective rather than only theoretical and/or simulation work. It is observed that electrical heating can be promising when applied in the right place, bringing benefits not only in low water-cut recovery but also in a low carbon footprint and low costs associated with environmental fees. The parameters significant for a robust and successful implementation of an electrical heating project are listed. Existing studies on electrical heating are either outdated, reflecting older technology; focused only on simulation/theoretical work; or focused only on reservoir or production aspects.
This study fills that gap, providing a detailed and comprehensive look at the theory and at real-field practical problems and solutions, from the source of electricity to the production of the heavy oil, illustrating the associated costs, so that it can serve as a solid reference for future implementations.
Nejadi, Siavash (University of Calgary) | Hubbard, Stephen M. (University of Calgary) | Shor, Roman J. (University of Calgary) | Gates, Ian D. (University of Calgary) | Wang, Jingyi (University of Calgary)
Steam chamber conformance in Steam Assisted Gravity Drainage (SAGD) influences the efficiency and economic performance of bitumen recovery. Conventional SAGD well completion designs provide limited control points along long horizontal well pairs, leading to the development of non-ideal steam chambers. Developing advanced wellbore completions and optimizing downhole tool settings is critical to achieving the steam distribution needed for optimal recovery in heterogeneous reservoirs.
This paper presents a workflow to optimize SAGD well completion design using flow control devices (FCDs). Optimum FCD placement and specifications are determined in consideration of reservoir heterogeneity. Uncertainties in the spatial distribution of facies and rock types and in reservoir rock and fluid properties are represented by multiple equiprobable deterministic and stochastic geological realizations using Monte Carlo simulation. The methodology is based on constrained nonlinear optimization to maximize the net present value (NPV) as the objective function. A coupled wellbore/reservoir simulation model of a well pad is implemented in the study, and the efficacy of scenarios with varied well designs is assessed by evaluating bitumen production, steam injection, and well completion expenses.
Results indicate superior performance of wells equipped with FCDs compared to conventional concentric and parallel dual-string well completion designs. For the cases examined, this translates to an average 7% increase in expected NPV across the well completion designs when using FCDs. Furthermore, the results show that zonal isolation in the well design is essential for compartmentalized reservoirs such as point bar deposits with their significant heterogeneity.
Advanced wellbore completions provide the tools to control steam injection into, and liquid production from, different well segments; manage steam chamber conformance along the horizontal well pairs; improve production efficiency; increase bitumen recovery; and reduce operating costs.
A novel workflow is presented to optimize advanced wellbore completions utilizing flow control devices. This integrated, assisted optimization approach considers uncertainties in geological properties and determines the optimal FCD parameters and well completion design with acceptable computational effort. The integrated workflow allowed us to undertake a thorough evaluation of the key subsurface uncertainties and design an overall development plan. The probabilistic nature of the results makes it possible to quantify the uncertainties and identify the risks associated with different completion strategies.
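The realization-based evaluation described above, scoring each candidate design by its NPV averaged over equiprobable geological realizations, can be sketched as follows. The "simulator" here is a stand-in analytic function; the realization parameters, the NPV response, and the candidate FCD settings are all hypothetical.

```python
import random

# Sketch of design evaluation under geological uncertainty: each candidate FCD
# setting is scored on every equiprobable realization, and candidates are
# ranked by expected (mean) NPV. fake_npv stands in for a coupled
# wellbore/reservoir simulation run; all parameters are hypothetical.

def fake_npv(fcd_setting, realization):
    """Stand-in for one simulation run: NPV response to an FCD setting."""
    perm, thickness = realization
    # Hypothetical response: NPV peaks when the setting matches permeability.
    return thickness * (10.0 - (fcd_setting - perm / 100.0) ** 2)

def expected_npv(fcd_setting, realizations):
    return sum(fake_npv(fcd_setting, r) for r in realizations) / len(realizations)

rng = random.Random(7)
# 50 equiprobable realizations: (permeability mD, net thickness m), hypothetical.
realizations = [(rng.uniform(200, 800), rng.uniform(10, 30)) for _ in range(50)]

candidates = [1.0, 3.0, 5.0, 7.0, 9.0]
best = max(candidates, key=lambda s: expected_npv(s, realizations))
print(best)
```

Ranking by expected NPV (rather than NPV on a single base-case model) is what makes the result robust to the subsurface uncertainty the abstract emphasizes; percentiles of the per-realization NPVs give the associated risk measures.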
Steam Assisted Gravity Drainage (SAGD) is used in the Western Canadian Sedimentary Basin (WCSB) for heavy oil recovery, and numerical simulation is a quantitative tool for evaluating SAGD performance and testing development options. This paper starts with the assumption that a single model is unlikely to be correct and cannot represent the range of outcomes arising from reservoir complexity and fluid heterogeneity. An alternative is to generate a number of models through an assisted history matching (AHM) process, with the intent of providing realistic estimates of the remaining uncertainty and quantifiable options to reduce it. For a new piece of surveillance to add value, our working hypothesis is that the new surveillance should not be correlated with existing surveillance, to avoid redundancy, and that the reduction in the range of acceptable models provides a way of quantifying the value. For our SAGD study, we evaluate whether temperature and pressure surveillance in observation wells has value, and how such data should be included in the objective function.
A synthetic two-dimensional SAGD model was used in the simulation, with a known truth case against which we can test the efficiency of different types of surveillance in recovering the true reservoir performance. Three observation wells are distributed through the model, each with five stations for temperature and pressure measurements. To test the value of the temperature measurements and observation wells, a blind search algorithm was used initially to avoid introducing a search bias that could distort the correlation. Rate, pressure, and temperature misfits were calculated for each experiment. Models with less than 5% rate misfit were selected and run for several years in prediction with the same depletion plan. The improvement in the lower confidence bound on the value of the field was used as the measure of the benefit of the surveillance.
A second test used an exploitative search technique on a simple arithmetic objective function. As expected, the global minimum did not represent the truth-case performance, though the truth case lay in the intersection of the surveillance constraints of matching pressure, rate, and temperature within noise. Temperature and pressure surveillance show some correlation, and hence some redundancy, yet analysis of the models resulting from both tests showed that pressure and temperature surveillance together reduced the uncertainty in future outcomes by 84% in cumulative oil production and 82% in NPV.
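The uncertainty-reduction percentages quoted above can be computed by comparing the spread of an ensemble forecast before and after filtering on a surveillance misfit cutoff. The following is a minimal sketch with hypothetical numbers, not the study's data; the 5% rate-misfit cutoff mirrors the one described above.

```python
# Sketch of the uncertainty-reduction metric: the P10-P90 range of a forecast
# (e.g. cumulative oil) across an ensemble, before and after models failing
# the surveillance misfit cutoff are removed. All numbers are hypothetical.

def p10_p90_range(values):
    vs = sorted(values)
    pick = lambda p: vs[int(p * (len(vs) - 1))]
    return pick(0.9) - pick(0.1)

def uncertainty_reduction(prior_forecasts, misfits, cutoff=0.05):
    """Percent reduction in P10-P90 forecast range after misfit filtering."""
    posterior = [f for f, m in zip(prior_forecasts, misfits) if m < cutoff]
    before = p10_p90_range(prior_forecasts)
    after = p10_p90_range(posterior)
    return 100.0 * (before - after) / before

# Hypothetical ensemble: 10 cumulative-oil forecasts (MMbbl) and rate misfits.
forecasts = [1.0, 1.2, 1.5, 1.8, 2.0, 2.3, 2.6, 3.0, 3.4, 4.0]
misfits   = [0.02, 0.03, 0.01, 0.04, 0.02, 0.09, 0.12, 0.20, 0.15, 0.30]
print(round(uncertainty_reduction(forecasts, misfits), 1))  # 66.7
```

The same calculation applied to NPV forecasts gives the second percentage; comparing reductions from pressure-only, temperature-only, and combined filtering quantifies the redundancy between the two surveillance types.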
The proposed workflow allows operators to quantify the potential of adding new surveillance to a field and/or to specify the number and locations of the observation wells required to properly capture steam chamber development and pressure propagation throughout a SAGD reservoir.
Khaliulin, Ruslan (RN-SakhalinNIPImorneft) | Litus, Dmitry (RN-SakhalinNIPImorneft) | Kharkovsky, Alexander (Schlumberger) | Shakhov, Dmitry (Schlumberger) | Nuriev, Ayvar (Schlumberger) | Maskaev, Pavel (Schlumberger) | Kuzmich, Andrey (Schlumberger)
Production of highly viscous oil is associated with a high level of geological and technological risk. To make heavy oil projects economically viable, it is important to reach oil production levels sufficient to cover the high costs of the technologies used. It is well known that thermal methods have the highest positive impact on heavy oil recovery. The complexity of the physical processes involved in implementing such methods represents a significant engineering challenge in planning and finding an optimal field development scenario. Modern modeling tools, such as a detailed integrated model combining reservoir and surface network simulators, allow engineers to model the complex physical phenomena and investigate different "what-if" scenarios in detail, helping them find and implement optimal solutions, minimize risks, and achieve economic profitability in complex heavy oil projects.
However, before such complex models are applied to engineering decisions, it is crucial to ensure that they deliver stable and valid results over the entire range of possible input parameter changes. These changes can be dictated either by uncertainty in the initial data, due to its limited availability and quality, or by actual variation resulting from complex physical phenomena. This paper describes the challenge of modeling the shallow heavy oil field Katangli, where cyclic steam injection has been used for heavy oil recovery for 40 years. In this work, comprehensive thermodynamic models of the reservoir and the surface infrastructure were built to account for all necessary physical effects. These models were then combined into a single "reservoir-to-surface" integrated model to capture the behavior of the entire system and the mutual influence of the two parts on each other. While building and calibrating these models, several important studies were performed with each part of the integrated model, which helped to better understand the cyclic steam injection process and the specifics of its application and modeling, and to address issues related to initial data availability, quality, and variability and their potential impact on future engineering decisions.
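The core of a "reservoir-to-surface" integrated model is finding a consistent operating point between reservoir deliverability and surface network back-pressure. The following is a minimal sketch of that coupling under assumed analytic inflow/outflow curves; the constants, the linear productivity-index relation, and the balancing scheme are illustrative assumptions, not the Katangli model.

```python
# Sketch of reservoir-to-surface coupling: at each balancing step, the
# reservoir model's deliverability and the surface network's required flowing
# pressure are iterated to a consistent operating point. Both "models" here
# are stand-in analytic curves with hypothetical constants.

def reservoir_rate(pwf, p_res=1500.0, pi=2.0):
    """Inflow: rate from a linear productivity-index relation (hypothetical)."""
    return max(0.0, pi * (p_res - pwf))

def surface_pwf(rate, p_sep=150.0, k=0.5):
    """Outflow: required flowing pressure rises with rate (hypothetical)."""
    return p_sep + k * rate

def balance(tol=1e-6, relax=0.5, max_iter=200):
    """Fixed-point iteration with under-relaxation on bottomhole pressure."""
    pwf, q = 500.0, 0.0
    for _ in range(max_iter):
        q = reservoir_rate(pwf)          # reservoir answers with a rate
        pwf_new = surface_pwf(q)         # network answers with a pressure
        if abs(pwf_new - pwf) < tol:
            break
        pwf = pwf + relax * (pwf_new - pwf)
    return pwf, q

pwf, q = balance()
# At balance: q = 2*(1500 - pwf) and pwf = 150 + 0.5*q  ->  pwf = 825, q = 1350
print(round(pwf), round(q))  # 825 1350
```

In a full integrated model this balance is solved per well per time step, which is why stability of each sub-model over the whole input range, as stressed above, matters before the coupled model is trusted.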
In this paper, a pseudo-component black oil reservoir model was constructed to simulate CO2 flooding in shale oil reservoirs. The CO2 flooding was conducted using two different approaches: cyclic injection with 10 wells (5 producers and 5 injectors) and Huff-n-Puff with only 5 wells used for both injection and production, with the same number and length of fractures. Cyclic injection recovered 38% more cumulative oil over the thirty-year prediction period than the Huff-n-Puff process. Next, Design of Experiments and proxy modeling were adopted to optimize the hydraulic fracturing design for cyclic CO2 flooding. Four factors were considered in the optimization: fracture half-length, primary width, permeability, and effective width. Experiments were designed by combining the levels of these four factors using Latin Hypercube Sampling. The optimization significantly improved the cyclic-injection cumulative oil production, by 12%. Two proxy models were constructed to obtain a simplified metamodel as an alternative to the large reservoir simulator. The RBF neural network matched the simulator's cumulative oil production much more accurately than polynomial regression. The most influential fracturing factors were also identified through their effects on shale oil reservoir performance.
Of the astounding 400 billion barrels of shale play oil in the United States alone, less than 15% has been recovered through primary production (Yu et al., 2014). With such a large quantity of reserves still unrecovered, new and inventive Enhanced Oil Recovery (EOR) processes are required to continue production. One promising EOR process in unconventional reservoirs is CO2 flooding through hydraulically fractured horizontal wells. There are two main approaches to CO2 flooding: Huff-n-Puff and cyclic injection. The Huff-n-Puff process consists of three steps: first, CO2 is injected (Huff); second, the well is shut in to allow the CO2 to propagate throughout the reservoir; third, the well is reopened and production resumes (Puff). The cyclic gas injection method is performed in the same manner with one key difference: separate wells are used for injection and production. These processes are well known in conventional sandstone reservoirs but are very new for shale formations developed with hydraulically fractured horizontal wells.
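The DoE/proxy step described above, Latin Hypercube Sampling over the four fracture factors followed by an RBF-type proxy fitted to the sampled responses, can be sketched as follows. This is a minimal illustration, not the paper's workflow: the "simulator" is a stand-in analytic function, the factors are normalized to hypothetical unit ranges, and a plain Gaussian RBF interpolant stands in for the RBF neural network.

```python
import math
import random

# Sketch of DoE + proxy modeling: Latin Hypercube Sampling over four
# normalized fracture-design factors, then a Gaussian RBF interpolant fitted
# to the sampled responses as a cheap proxy for the reservoir simulator.

def latin_hypercube(n, bounds, rng):
    """n samples with exactly one point per stratum in each dimension."""
    cols = []
    for lo, hi in bounds:
        strata = [lo + (hi - lo) * (i + rng.random()) / n for i in range(n)]
        rng.shuffle(strata)               # random pairing across dimensions
        cols.append(strata)
    return [list(point) for point in zip(*cols)]

def solve(a, b):
    """Gauss-Jordan elimination with partial pivoting for the RBF weights."""
    n = len(a)
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(m[r][c]))
        m[c], m[p] = m[p], m[c]
        for r in range(n):
            if r != c and m[r][c] != 0.0:
                f = m[r][c] / m[c][c]
                m[r] = [x - f * y for x, y in zip(m[r], m[c])]
    return [m[i][n] / m[i][i] for i in range(n)]

def rbf_fit(xs, ys, eps=4.0):
    def phi(a, b):
        return math.exp(-eps * sum((u - v) ** 2 for u, v in zip(a, b)))
    weights = solve([[phi(x1, x2) for x2 in xs] for x1 in xs], ys)
    return lambda x: sum(w * phi(x, xi) for w, xi in zip(weights, xs))

rng = random.Random(3)
bounds = [(0.0, 1.0)] * 4                 # 4 normalized factors (hypothetical)
xs = latin_hypercube(20, bounds, rng)
simulator = lambda x: sum(x)              # stand-in cumulative-oil response
proxy = rbf_fit(xs, [simulator(x) for x in xs])
print(abs(proxy(xs[0]) - simulator(xs[0])) < 1e-4)  # reproduces training points
```

Once fitted, the proxy can be evaluated thousands of times per second inside an optimizer, which is the point of replacing the large reservoir simulator during the fracture-design search.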
The application of a novel modeling and optimization approach is presented, demonstrating the impact of quantitatively optimized steam redistribution in mature heavy oil fields. Results are presented for a steamflood in the San Joaquin Basin in California, demonstrating significant savings in steam and operational costs and a significant production increase, ultimately increasing net present value (NPV) by at least 10%.
The new approach, termed
The paper first describes the
For the field in this case study, production comes from poorly consolidated sands within the Antelope Shale member of the Miocene Monterey formation, with porosity averaging 30%, permeability averaging 2,000 mD, and net thicknesses typically between 50 and 300 feet. Structural dip is steep, at approximately 60 degrees. The reservoirs are shallow, with depths ranging from 200 to 600 feet TVD. Oil gravity is approximately 13° API. Reservoir pressures are well below the bubble point, averaging 50 to 100 psi.
The most significant new finding is that steam redistribution can be quantitatively optimized rapidly to maximize short- and long-term EOR economics. This is important, particularly for mature fields, in order to maximize recovery amid varying commodity prices.
The novelty of the new model is its combination of speed of data integration (less than a week) and runtime (minutes) with long-term predictive accuracy (years or decades). This is due to the unique integration of reservoir physics with fast data-driven methods.