Simplex method: a form of mathematical programming in which the objective function is a linear combination of the independent variables. The solution technique is called the simplex method because it can be viewed as a search along the edges of the feasible polytope.
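The vertex-search idea behind the simplex method can be illustrated with a tiny two-variable LP. The sketch below (illustrative coefficients, not taken from any PetroWiki example) enumerates the vertices of the feasible polytope directly; this is practical only for toy problems, but it shows why it suffices to examine vertices when the objective is linear, which is what simplex exploits by walking between adjacent vertices along edges.

```python
# Solve: maximize 3x + 2y subject to x + y <= 4, x + 3y <= 6, x >= 0, y >= 0.
# For a linear objective over a polytope the optimum lies at a vertex, so we
# enumerate candidate vertices from all constraint pairs (illustration only).
from itertools import combinations

# Constraints written as a*x + b*y <= c, including the non-negativity bounds.
constraints = [
    (1.0, 1.0, 4.0),    # x + y <= 4
    (1.0, 3.0, 6.0),    # x + 3y <= 6
    (-1.0, 0.0, 0.0),   # -x <= 0  (x >= 0)
    (0.0, -1.0, 0.0),   # -y <= 0  (y >= 0)
]

def objective(x, y):
    return 3.0 * x + 2.0 * y

best = None
for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        continue                       # parallel constraints: no vertex
    x = (c1 * b2 - c2 * b1) / det      # Cramer's rule for the 2x2 system
    y = (a1 * c2 - a2 * c1) / det
    feasible = all(a * x + b * y <= c + 1e-9 for a, b, c in constraints)
    if feasible and (best is None or objective(x, y) > best[0]):
        best = (objective(x, y), x, y)

print(best)   # best (objective, x, y) over the vertices of the polytope
```

The simplex method avoids exhaustive enumeration by moving from one vertex to a better adjacent one until no improvement is possible.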
Decision tree analysis and Monte Carlo simulation are the most commonly used tools in decision and risk analysis, but other tools, such as optimization, options analysis, and combinations of these, can also be useful. This article examines the importance of data analysis and the nature and application of these other tools. Regardless of the principal tool used in risk analysis, whether Monte Carlo simulation or decision trees, empirical data may play an important role: estimating the probabilities and values for a decision tree is often done by examining historical data.
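As a minimal illustration of estimating decision-tree inputs from historical data, the sketch below computes the expected monetary value (EMV) of a two-branch drilling decision, with the success probability taken from a hypothetical analog data set. All outcome values and dollar figures are assumptions for illustration, not from the article.

```python
# Hedged sketch: estimate a branch probability from (hypothetical) historical
# analog outcomes, then evaluate a two-branch decision tree by expected value.
historical_outcomes = ["dry", "dry", "dry", "success", "dry", "success"]
p_success = historical_outcomes.count("success") / len(historical_outcomes)

npv_success = 20.0e6     # assumed NPV of a successful well
npv_dry = -4.0e6         # assumed NPV (cost) of a dry hole
npv_farm_out = 1.0e6     # assumed fixed payment if the prospect is farmed out

emv_drill = p_success * npv_success + (1 - p_success) * npv_dry
best_choice = "drill" if emv_drill > npv_farm_out else "farm out"
print(f"P(success)={p_success:.2f}, EMV(drill)={emv_drill:,.0f} -> {best_choice}")
```

In practice the probabilities would come from a curated analog database rather than a six-well list, but the mechanics are the same.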
For decades, steam-assisted oil recovery processes have been successfully deployed in heavy oil reservoirs to extract bitumen and heavy oil. Current resource allocation practices mostly involve reservoir-model-based open-loop optimization at the planning stage, repeated periodically. However, such decades-old strategies need a complete overhaul because they ignore dynamic changes in reservoir conditions and surface facilities, ultimately rendering heavy oil production economically unsustainable in a low-oil-price environment. Since steam supply costs account for more than 50% of total operating costs, a data-driven strategy that transforms the data available from various sensors into meaningful steam allocation decisions deserves further attention.
In this research, we propose a purely data-driven algorithm that maximizes an economic objective function by allocating an optimal amount of steam to different well pads. The method comprises two main components: forecasting and nonlinear optimization. A dynamic model relates different variables in historical field data measured at regular time intervals and can be used to compute economic performance indicators (EPI). The model variables are cumulative in nature, since cumulative quantities capture the temporal changes in reservoir conditions. Accurate prediction of the EPI is ensured by retraining the regression model with the latest available data. The predicted EPI is then optimized using a nonlinear optimization algorithm subject to amplitude and rate saturation constraints on the decision variables, i.e., the amount of steam allocated to each well pad.
The proposed steam allocation strategy is tested on two well pads (each containing 10 wells) of an oil sands reservoir near Fort McMurray in Alberta, Canada. After exploratory analysis of the production history, an output error (OE) model is built between logarithmically transformed cumulative steam injection and cumulative oil production for each well pad. The commonly used net present value (NPV) is taken as the EPI to be maximized. Optimization of the objective function is subject to distinct operating conditions and realistic constraints. Comparison with the field production history shows that the optimum steam injection profiles for both well pads differ significantly from the field history. In fact, the proposed algorithm provides smooth and consistent steam injection rates, unlike the field injection history. A lower steam-oil ratio is also achieved for both well pads, ultimately translating into approximately 19% higher NPV compared with the field data.
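The allocation step can be sketched as a constrained nonlinear optimization. The toy model below replaces the paper's dynamic OE model with a static diminishing-returns response, where cumulative oil from pad i is approximated as k_i * log(1 + steam_i); the coefficients, prices, and limits are all assumptions, and only amplitude (bound) constraints plus a total-steam-supply constraint are shown, standing in for the paper's amplitude and rate saturation constraints.

```python
# Hedged sketch of NPV-maximizing steam allocation across two well pads,
# using an assumed static response model in place of the retrained OE model.
import numpy as np
from scipy.optimize import minimize

k = np.array([8.0, 5.0])        # assumed pad response coefficients
p_oil, c_steam = 60.0, 12.0     # assumed oil price and steam cost per unit
s_max, s_total = 6.0, 9.0       # per-pad amplitude limit and total steam supply

def neg_npv(s):
    oil = k * np.log1p(s)                       # assumed diminishing returns
    return -(p_oil * oil - c_steam * s).sum()   # negate: minimize -NPV

res = minimize(
    neg_npv,
    x0=np.array([3.0, 3.0]),
    method="SLSQP",
    bounds=[(0.0, s_max)] * 2,                  # amplitude constraints
    constraints=[{"type": "ineq", "fun": lambda s: s_total - s.sum()}],
)
print("optimal steam split:", res.x, "NPV:", -res.fun)
```

At the optimum the total-steam constraint is active and the higher-response pad receives more steam, which mirrors the intuition of allocating scarce steam where the marginal oil response is highest.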
Inspired by state-of-the-art control techniques, the proposed steam allocation algorithm provides a generic data-driven framework that can accommodate any number of well pads, EPIs, and any amount of past data. It is computationally inexpensive because no numerical simulations are required. Overall, it can potentially reduce the energy required to extract heavy oil and increase revenue at no additional capital cost, while reducing greenhouse gas emissions.
The petro-elastic model (PEM) is an integral component in the closed-loop calibration of integrated four-dimensional (4D) solutions incorporating time-lapse seismic, elastic and petrophysical rock property modeling, and reservoir simulation. Calibration of the reservoir simulation model is needed so that it is consistent not only with production history but also with the contemporaneous subsurface description characterized by time-lapse seismic. The PEM requires dry rock properties in its description, which are typically derived from mechanical rock tests. In the absence of those mechanical tests, a small-data challenge is posed: not all necessary data are available, yet the value of reconciling seismic attributes with simulated production remains. A seismic-inversion-constrained n-dimensional metaheuristic optimization technique is employed directly on three-dimensional (3D) geocellular arrays to determine elastic and density properties for the PEM embedded in the commercial reservoir simulator.
Ill-posed dry elastic and density property models are considered in a field case where a seismic inversion, and a petrophysical property model constrained by that inversion, exist. An n-dimensional design optimization technique is implemented to determine the optimal solution of a multidimensional pseudo-objective function composed of multidimensional design variables. This study investigates a modified particle swarm optimization (PSO) method combined with an exterior penalty function (EPF) under varied constraints: the pseudo-objective function formed from the PSO objective and the EPF is solved by n-dimensional design optimization given the limited availability of constraints. An examination of heavily penalized and reduced-order penalized metaheuristic optimization processes, where the design variables and optimal solution are derived from 3D arrays, is conducted so that constraint applicability can be quantified. While the process is examined specifically for the PEM, it can be applied to other data-limited modeling techniques.
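The PSO-plus-EPF idea can be sketched on a two-variable toy problem: constraint violations are folded into the objective as a quadratic exterior penalty, so the swarm minimizes a pseudo-objective rather than handling constraints explicitly. The test function, penalty weight, and swarm settings below are illustrative assumptions; the paper applies the idea to design variables drawn from 3D geocellular arrays.

```python
# Hedged PSO + exterior-penalty sketch on a toy constrained problem:
# minimize (x-1)^2 + (y-2)^2 subject to x + y <= 2 (true optimum ~ (0.5, 1.5)).
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    return (x[..., 0] - 1.0) ** 2 + (x[..., 1] - 2.0) ** 2

def penalty(x):
    # exterior penalty: zero inside the feasible region, quadratic outside
    g = x[..., 0] + x[..., 1] - 2.0
    return np.maximum(g, 0.0) ** 2

def pseudo_objective(x, r=100.0):        # "heavily penalized" pseudo-objective
    return objective(x) + r * penalty(x)

n, dim, iters = 40, 2, 200
pos = rng.uniform(-5.0, 5.0, (n, dim))
vel = np.zeros((n, dim))
pbest = pos.copy()
pbest_val = pseudo_objective(pbest)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(iters):
    r1, r2 = rng.random((n, 1)), rng.random((n, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    val = pseudo_objective(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("constrained optimum ~", gbest)
```

With a finite penalty weight the converged point sits marginally outside the constraint; increasing the weight (the "heavily penalized" case) trades feasibility accuracy against a harder-to-search landscape, which is one way to read the paper's comparison of heavily penalized and reduced-order penalized runs.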
Lei, Zhengdong (Research Institute of Petroleum Exploration and Development, PetroChina) | Xie, Qichao (Exploration and Development Research Institute of ChangQing Oilfield Company) | Tao, Zhen (Research Institute of Petroleum Exploration and Development, PetroChina) | He, YouAn (Exploration and Development Research Institute of ChangQing Oilfield Company) | Zhu, Zhouyuan (China University of Petroleum) | Peng, Yan (China University of Petroleum) | Liu, Canhua (China University of Petroleum)
Waterflooding of fractured low-permeability reservoirs is often associated with poor sweep and high water cut due to the existence of natural fractures, hydraulic fractures, and artificially induced fractures. Therefore, reservoir simulation with coupled geomechanics and dynamic fractures is required for this application. In this work, we present the use of streamline-derived flux information to improve overall waterflooding performance in such complex simulation problems.
This work presents the waterflooding optimization workflow for a fractured low-permeability reservoir in ChangQing Oilfield, China. First, a finite-difference simulator that accounts for the stress field and geomechanical properties is used to simulate the growth of dynamic fractures. Then, the newly formed fracture properties are incorporated into the dual porosity/permeability reservoir simulation model. Afterwards, streamlines are traced from the velocity field of this model, representing a snapshot of the inter-well fluxes. Finally, with the goal of minimizing field water production, we apply linear programming algorithms to optimize the waterflooding operation, accounting for inter-well connectivity and well allocation factors.
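The final linear-programming step can be sketched as follows: given streamline-derived well allocation factors (here, the fraction of each injector's water that returns as produced water), a fixed total injection volume is reallocated across injectors to minimize field water production, subject to per-well rate limits. The factors and limits below are illustrative assumptions, not ChangQing field values.

```python
# Hedged sketch: LP reallocation of injection water using assumed
# streamline-derived allocation factors.
from scipy.optimize import linprog

# waf[i] = assumed fraction of injector i's water produced back as water
waf = [0.65, 0.30, 0.45, 0.20]
q_total = 1000.0                    # total field injection to preserve, m3/day
lb, ub = 100.0, 400.0               # per-injector rate limits, m3/day

res = linprog(
    c=waf,                                        # minimize produced water
    A_eq=[[1.0] * len(waf)], b_eq=[q_total],      # keep total injection fixed
    bounds=[(lb, ub)] * len(waf),
)
print("optimized injection rates:", res.x)
```

The solution pushes water toward the low-waf (high-efficiency) injectors up to their limits, which is the same mechanism as the sweep-improving reallocation described above; in the full workflow the allocation factors would be re-derived from updated streamline traces after each reallocation.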
Through reservoir simulation coupled with geomechanics, we found that the induced fracture growth rate is relatively limited at reasonable injection rates, which is also validated by empirical field observations. Keeping injection rates reasonable can thus avoid fracture propagation and reduce the risk of rapid water breakthrough. We deploy our streamline-tracing and linear-programming-based optimization program together with this simulation model. A controlled and cautious increase in injection rate resulted in a positive production response in 28 producers in the pilot area. Reallocation of water to high-efficiency injectors improves sweep efficiency in the reservoir. Finally, the optimized scenario resulted in more than 15% incremental swept volume compared with the base development case.
This work provides a comprehensive case study for better understanding the impact of fracture growth on waterflooding performance in fractured low-permeability reservoirs. It further establishes the workflow of using streamline-based flux information for oil production optimization in these complex simulation problems.
The present study provides a comprehensive set of new analytical expressions to help understand and quantify well interference due to competition for flow space between the hydraulic fractures of parent and child wells. Determination of the optimum fracture spacing is a key factor in improving the economic performance of unconventional oil and gas resources developed with multi-well pads. Analytical and numerical model results are combined in our study to identify, analyze, and visualize the streamline patterns near hydraulic fractures, using physical parameters that control the flow process, such as matrix permeability and hydraulic fracture dimensions, assuming infinite fracture conductivity. The algorithms provided can quantify the effect of changes in fracture spacing on the production performance of both parent and child wells. All results are based on benchmarked analytical methods which allow for fast computation, making use of Excel-based spreadsheets and Matlab-coded scripts. Such practical tools can support petroleum engineers in the planning of field development operations. The theory is presented with examples of its practical application using field data from parent and child wells in the Eagle Ford shale (Brazos County, East Texas). Based on our improved understanding of the mechanism and intensity of production interference, the fracture spacing (this study) and inter-well spacing (companion study) of multifractured horizontal laterals can be optimized to effectively stimulate the reservoir volume, increase the overall recovery factor, and improve the economic performance of unconventional oil and gas properties.
A high risk of suboptimal well placement exists in new field development where seismic uncertainty can be great. Recent developments in ultradeep resistivity measurement provide great benefits for identifying and optimizing the well path position within a given stratigraphic sequence. This paper presents a case study in which an operator planned to place wells 10 m TVD below the reservoir top because of seismic uncertainty in the top reservoir pick. To help mitigate this subsurface risk, the field development plan required real-time well placement optimization, using both standard formation evaluation data and an ultradeep azimuthal resistivity service. In this case history, the ultradeep inversion canvases could be used to identify the well path position within the reservoir, as well as provide sufficient confidence to steer the well closer to the reservoir top than originally planned.
Multiple geological models, created from nearby offset wells and seismic grids, represented the expected seismic uncertainty of 5 to 15 m TVD. To identify the optimal measurement setup for real-time operations, resistivity modelling illustrated the effect of frequency and spacing on the data, producing multiple inversions for each geological scenario. After drilling began, real-time inversions for the ultradeep resistivity data were initially qualified using standard formation evaluation data, including both deep azimuthal resistivity and azimuthal density images. Multiple inversion canvases from various spacings and frequencies identified several formation features, including distances to the top and base of the reservoir. The quantified uncertainty of these results assisted in the evaluation of the inversion quality.
When close to the reservoir top, the wellbore position indicated in the ultradeep inversion canvases matched the interpretation from the conventional logs, which provided increased confidence in the inversion canvas results at greater distances. This enhanced reservoir knowledge enabled the operator to progressively raise the well path to 5 and then 2 m TVD from the reservoir top. Except for strategic geosteering decisions based on expected fault positions from the seismic data, the operator made most well-placement decisions, across multiple wells, using ultradeep resistivity data. The high data quality and close collaboration within the subsurface team quickly led to high confidence in the inversion results. Integrating the full suite of available data, from shallow to ultradeep measurements, in a comprehensive interpretation provided better reservoir understanding, resulting in optimal well placement.
This paper presents formation evaluation results used within an integrated well-placement optimization service from a new field development. The integrated data qualified the results for an ultradeep resistivity tool. Confidence in the tool results enabled the operator to place wells much closer to the reservoir top than initially planned, in an area of seismic uncertainty.
Integration of time-lapse seismic data into the dynamic reservoir model is an efficient way to calibrate reservoir parameter updates. The choice of the metric that measures the misfit between observed data and the simulated model has a considerable effect on the history matching process, and hence on the optimal ensemble of models obtained. History matching with 4D seismic and production data simultaneously remains a challenge because of the different nature of the two data types (time series versus maps or volumes).
Conventionally, the misfit is formulated as a least-squares function, which is widely used for production data matching. Distance-based objective functions designed for 4D image comparison have been explored in recent years and proven reliable. This study explores the history matching process by introducing a merged objective function combining the production and 4D seismic data. The proposed approach makes these two types of data (well and seismic) comparable within a single objective function to be optimised, thereby avoiding the question of weights. An adaptive evolutionary optimisation algorithm is used for the history matching loop. Local and global reservoir parameters are perturbed in this process, including porosity, permeability, net-to-gross, and fault transmissibility.
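A minimal sketch of such a merged objective, assuming each term is normalized to a dimensionless scale so the two data types can be summed without hand-tuned weights; the normalizations below (data-error scaling for production, relative image distance for seismic) are illustrative assumptions, and the paper's exact formulation may differ.

```python
# Hedged sketch: merge a production least-squares misfit and a 4D seismic
# image-distance misfit into one dimensionless objective.
import numpy as np

def production_misfit(obs, sim, sigma):
    # least-squares misfit normalized by an assumed production data error
    return np.mean(((obs - sim) / sigma) ** 2)

def seismic_misfit(obs_map, sim_map):
    # dimensionless image distance between observed and simulated 4D maps
    return np.linalg.norm(obs_map - sim_map) / np.linalg.norm(obs_map)

def merged_objective(obs, sim, sigma, obs_map, sim_map):
    # both terms are dimensionless, so no explicit weighting is needed
    return production_misfit(obs, sim, sigma) + seismic_misfit(obs_map, sim_map)

rng = np.random.default_rng(1)
obs_rate = np.linspace(100.0, 50.0, 20)          # synthetic production series
sim_rate = obs_rate + rng.normal(0.0, 2.0, 20)   # simulated series with error
obs_map = rng.random((10, 10))                   # synthetic 4D attribute map
J = merged_objective(obs_rate, sim_rate, 2.0, obs_map, obs_map)
print("merged misfit:", J)                       # seismic term is zero here
```

In the history matching loop, J would be evaluated for each perturbed model and minimized by the evolutionary optimiser.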
This production and seismic history matching has been applied to a UKCS field; it shows that an acceptable production data match is achieved while honouring saturation information obtained from the 4D seismic surveys.
Scapolo, Matteo (KIDOVA France & Imperial College London) | Garcia, Michel H. (KIDOVA France) | Mathieu, Jean-Baptiste (KIDOVA France) | Siffert, Deborah (KIDOVA France) | Gosselin, Olivier R. (Imperial College London) | Ackerer, Philippe (LHyGeS)
Using geostatistical modelling to populate reservoir properties is nowadays the most common approach in the industry, and it has received a great deal of attention. A geostatistical reservoir model defines a space of spatial uncertainty, which can be explored by generating many equiprobable reservoir property realisations, each a possible reservoir model complying with the static data. Among them, the relevant models are those that also match the dynamic data, which complete the available data for reservoir model calibration. Finding the relevant reservoir models in the space of spatial uncertainty is a time-consuming process that requires simulating the dynamic (flow) response of many reservoir models, so a fast and reliable simulation method is highly desirable to speed up calibration. In this context, a new approach has been developed and tested. The method allows easy and fast comparison between interpreted well-test results and equivalent (average) reservoir model properties in terms of transmissivity (k.h) and permeability. The comparison can be used to validate or reject a reservoir model, and to obtain indications on how to modify it to fit the well-test data. This paper presents the method and the results obtained to evaluate its performance and validate it.
Well-test-interpreted permeabilities (or transmissivities) are weighted average permeabilities, calculated from the permeabilities on closed surfaces properly defined around the well, with weights depending on the flow geometry. The proposed method is based on steady-state flow simulation, carried out by making the tested well a source term (producing or injecting well) at the centre of a simulation domain (reservoir model region). The latter must extend far enough to contain, or at least overlap, the stabilisation area of the well test in which average transmissivities are to be estimated. The method relies on three key aspects: defining a simulation domain (extension and shape) consistent with the actual well-test drainage area; defining relevant boundary conditions to reproduce flow paths consistent with those generated by the actual well test; and using the new effective-gradient-based averaging method to compute average permeabilities over the properly defined closed surfaces.
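The notion of a well-test permeability as a flow-geometry-weighted average can be illustrated with the classical radial-composite case (the standard series-flow result, not the paper's effective-gradient method): for steady-state radial flow through concentric rings around the well, the equivalent permeability is the harmonic average of the ring permeabilities weighted by ln(r_outer/r_inner). The ring radii and permeabilities below are made up.

```python
# Hedged illustration: equivalent permeability of concentric rings in series
# under steady-state radial flow (classical result, illustrative values).
import math

rings = [            # (r_inner, r_outer, permeability in mD)
    (0.1, 1.0, 500.0),
    (1.0, 10.0, 100.0),
    (10.0, 100.0, 50.0),
]

total_log = sum(math.log(ro / ri) for ri, ro, _ in rings)
k_eq = total_log / sum(math.log(ro / ri) / k for ri, ro, k in rings)
print(f"equivalent permeability ~ {k_eq:.2f} mD")
```

Note how the low-permeability outer rings dominate the average, exactly because the weights follow the flow geometry; this is the kind of geometry-dependent weighting the proposed averaging method generalizes to arbitrary closed surfaces.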
The method is tested on various synthetic and partly real field cases, for which the transient well-test responses are first simulated and interpreted, then compared with the transmissivities predicted using the new method. Sensitivity analysis is also carried out on calculation parameters (flow simulation domain, flow rates…) to check the robustness of the method and identify avenues for improvement. All these results tend to confirm the effectiveness of the method, which combines speed and accuracy. The method is intended to be used as an objective function for automatic or assisted calibration of reservoir models on interpreted well-test data. It is expected to be particularly useful for calibrating naturally fractured reservoir models, for which permeability tensors are to be calculated from uncertain, locally defined fracture property statistics.
You, Junyu (Petroleum Recovery Research Center) | Ampomah, William (Petroleum Recovery Research Center) | Kutsienyo, Eusebius Junior (Petroleum Recovery Research Center) | Sun, Qian (Petroleum Recovery Research Center) | Balch, Robert Scott (Petroleum Recovery Research Center) | Aggrey, Wilberforce Nkrumah (KNUST) | Cather, Martha (Petroleum Recovery Research Center)
This paper presents an optimization methodology on field-scale numerical compositional simulations of CO2 storage and production performance in the Pennsylvanian Upper Morrow sandstone reservoir in the Farnsworth Unit (FWU), Ochiltree County, Texas. This work develops an improved framework that combines hybridized machine learning algorithms for reduced order modeling and optimization techniques to co-optimize field performance and CO2 storage.
The model's framework incorporates geological, geophysical, and engineering data. We calibrated the model against the performance history of an active CO2 flood to attain a successfully history-matched model. Uncertain parameters such as reservoir rock properties and relative permeability exponents were adjusted to incorporate potential wettability changes into the history-matched model.
To optimize an objective function incorporating parameters such as oil recovery factor, CO2 storage, and net present value, a proxy model was generated with hybridized multi-layer and radial basis function (RBF) neural network methods. To obtain a reliable and robust proxy, it underwent an iterative series of training and calibration runs until it reached the specified validation criteria. Once an acceptable proxy was realized, hybrid evolutionary and machine learning optimization algorithms were used to attain an optimum solution for the pre-defined objective function. The uncertain and/or control variables used for the optimization study included gas-oil ratio, water-alternating-gas (WAG) cycle, production rates, and bottomhole pressures of producers and injectors. CO2 purchase volume and recycled gas volume, in addition to the placement of new infill wells, were also considered in the modelling process.
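The proxy idea can be sketched in one dimension: fit a Gaussian RBF interpolant to a few evaluations of an expensive objective, then optimize the cheap interpolant instead of the simulator. The toy "simulator", shape parameter, and grid search below are assumptions for illustration; the paper hybridizes multi-layer and RBF neural networks and uses evolutionary optimizers rather than a grid.

```python
# Hedged sketch: Gaussian RBF surrogate of an expensive objective, then
# optimization of the surrogate (toy 1D example, assumed parameters).
import numpy as np

def simulator(x):                    # stand-in for an expensive reservoir run
    return -(x - 0.3) ** 2           # true optimum at x = 0.3

x_train = np.linspace(0.0, 1.0, 9)   # a handful of "simulation" samples
y_train = simulator(x_train)

eps = 3.0                            # assumed RBF shape parameter
def phi(r):
    return np.exp(-(eps * r) ** 2)

# interpolation: solve Phi @ w = y for the RBF weights
Phi = phi(np.abs(x_train[:, None] - x_train[None, :]))
w = np.linalg.solve(Phi, y_train)

def proxy(x):
    return phi(np.abs(x[:, None] - x_train[None, :])) @ w

# optimize the proxy with a dense grid search (cheap, unlike the simulator)
grid = np.linspace(0.0, 1.0, 1001)
x_opt = grid[np.argmax(proxy(grid))]
print("proxy optimum near", x_opt)
```

In the full workflow, new simulator runs near the proxy optimum would be added to the training set and the proxy retrained, iterating until the validation criteria are met.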
The results from the sensitivity analysis reflect the impacts of the control variables on the optimum results. The predictive study suggests that it is possible to develop a robust machine learning optimization algorithm that can reliably optimize a development strategy to maximize both oil production and storage of CO2 in aqueous, gaseous, and mineral phases within the FWU.