Chemical enhanced oil recovery (EOR) methods have received increased attention in recent years because they can recover capillary-trapped oil. Successful chemical flooding requires accurate numerical models and reliable forecasts across multiple scales: core scale, pilot scale, and field scale. History matching and optimization are two key steps toward this goal.
For history matching chemical floods, we propose a general workflow for multi-stage model calibration using an evolutionary algorithm. A comprehensive chemical flooding simulator is used to model important physical mechanisms including phase behavior, cation exchange, chemical and polymer adsorption, and capillary desaturation. First, we identify dominant reservoir and process parameters based on a sensitivity analysis. The history matching is then carried out in a stage-wise manner whereby the most dominant parameters are calibrated first and additional parameters are incorporated sequentially until a satisfactory data misfit is achieved. Next, a diverse subset of history-matched models is selected for optimization using a Pareto-based multi-objective approach. Based on the concept of dominance, Pareto optimal solutions are generated representing the trade-off between increasing oil recovery and improving the efficiency of chemical usage. These solutions are searched using the Non-dominated Sorting Genetic Algorithm (NSGA-II). Finally, we implement a History Matching Quality Index (HMQI) with moving linear regression analysis to evaluate simulation results from the history matching process. The HMQI provides normalized values for objective functions of different magnitudes and leads to a more consistent and robust approach to evaluating the updated models during model calibration.
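The Pareto dominance concept that underlies NSGA-II can be sketched briefly. The following minimal Python example (a generic illustration, not the authors' implementation) extracts the first non-dominated front from a set of candidate solutions, using two toy objectives standing in for oil recovery and chemical-usage efficiency:

```python
import numpy as np

def dominates(a, b):
    """True if solution a Pareto-dominates b (all objectives maximized)."""
    a, b = np.asarray(a), np.asarray(b)
    return bool(np.all(a >= b) and np.any(a > b))

def pareto_front(points):
    """Indices of non-dominated points, i.e. the first NSGA-II front."""
    return [i for i, p in enumerate(points)
            if not any(dominates(q, p) for j, q in enumerate(points) if j != i)]

# Toy objectives: (oil recovery, chemical-usage efficiency), both to be maximized
pts = [(0.9, 0.2), (0.5, 0.5), (0.2, 0.9), (0.4, 0.4)]
print(pareto_front(pts))  # → [0, 1, 2]; (0.4, 0.4) is dominated by (0.5, 0.5)
```

NSGA-II repeats this sorting on the remaining points to build successive fronts, then uses crowding distance within each front to preserve diversity.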
Multistage hydraulic fracturing of a horizontal well in an unconventional reservoir tends to induce a complex fracture network (CFN) which is challenging to characterize by conventional methods. In this work, we develop a fracture characterization workflow to estimate the geometric configuration and fracture properties of a CFN by assimilating microseismic event data and production data, sequentially.
A novel stochastic fractal model that is consistent with rock physics and outcrop observations is developed to generate realizations of the complex fracture network. In the first stage of the two-stage assisted history matching workflow, we estimate the parameters of the stochastic fractal model (fracture intensity, average fracture length, orientation, and fracture distribution) by using a genetic algorithm to history match the locations of microseismic events. In the second stage, the production data from the shale reservoir are assimilated with the ES-MDA algorithm to estimate the stimulated reservoir volume (SRV) and its average permeability, as well as fracture permeability, aperture, and porosity. In the unconventional shale gas reservoir simulator used as the forward model, large-scale fractures are modeled via the embedded discrete fracture model (EDFM) and a dual-porosity, dual-permeability (DP-DK) model is used for the SRV and small-scale fractures. The simulator includes Knudsen diffusion and the Langmuir adsorption/desorption model.
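The ES-MDA update used in the second assimilation stage follows the standard form of the algorithm: each of several iterations applies a Kalman-like correction with inflated observation noise. The sketch below is a generic textbook implementation (not the authors' code), demonstrated on a toy linear forward model g(m) = 2m:

```python
import numpy as np

def es_mda_update(M, D, d_obs, Cd, alpha, rng):
    """One ES-MDA step.

    M : (Nm, Ne) ensemble of model parameters
    D : (Nd, Ne) predicted data from the forward model
    d_obs : (Nd,) observed data
    Cd : (Nd, Nd) observation-error covariance
    alpha : inflation coefficient for this step (sum of 1/alpha over steps = 1)
    """
    Ne = M.shape[1]
    dM = M - M.mean(axis=1, keepdims=True)
    dD = D - D.mean(axis=1, keepdims=True)
    Cmd = dM @ dD.T / (Ne - 1)                      # parameter-data cross-covariance
    Cdd = dD @ dD.T / (Ne - 1)                      # predicted-data covariance
    # perturb the observations with inflated noise
    noise = rng.multivariate_normal(np.zeros(len(d_obs)), alpha * Cd, size=Ne).T
    D_obs = d_obs[:, None] + noise
    K = Cmd @ np.linalg.inv(Cdd + alpha * Cd)       # Kalman-like gain
    return M + K @ (D_obs - D)

# Demo with a linear toy forward model g(m) = 2m and true m = 3
rng = np.random.default_rng(0)
M = rng.normal(0.0, 1.0, size=(1, 200))             # prior ensemble
d_obs, Cd = np.array([6.0]), np.array([[0.01]])
for _ in range(4):                                  # Na = 4 steps, alpha = 4 each
    M = es_mda_update(M, 2 * M, d_obs, Cd, 4.0, rng)
```

For this linear-Gaussian toy case the ensemble mean converges to the Bayesian posterior mean near m = 3; in the paper the forward model is instead the shale gas reservoir simulator.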
For validation, we consider a synthetic shale gas reservoir with a horizontal well that has been stimulated by multistage hydraulic fracturing. A particular realization of the variables that describe the reservoir model is used to generate observed data for microseismic events and production rates. The parameters adjusted to match the observed microseismic events are the expected values of the length, orientation, and intensity of the natural fracture distribution and the fractal pattern. Results show that we obtain good estimates of the expected natural fracture length, orientation, intensity, and fracture distribution by history matching the observed locations of microseismic events. These estimates provide an updated stochastic fractal model for the configuration of the CFN. The history-matched fractal model is then used to generate an ensemble of fracture distributions consistent with the microseismic data, which serve as candidate fracture configurations when estimating fracture properties by matching production data. We obtain much better history matches, future performance predictions, estimates of stimulated reservoir volume and its average permeability, and estimates of fracture permeability, porosity, and aperture when we match both microseismic and production data than when we match only production data. When both microseismic and production data are matched for synthetic cases and parameters are properly scaled, the true values of parameters and reservoir performance predictions fall within the P25-P75 confidence intervals calculated from the ensemble of history-matched models in virtually all cases.
In practice, the proper characterization of the CFN and reservoir properties should be useful for placing new wells and designing fracture treatments.
Geophysical Reservoir Monitoring (GRM) systems such as 4D seismic are increasingly used in the oil and gas industry because they provide unique and useful information on fluid movement within the reservoir. This information is relevant for many reservoir management decisions, including new well placement, well intervention, and reservoir model updating.
Unfortunately, it has been difficult to estimate the value created by any data acquisition scheme, because a multidisciplinary approach is required to model the value that future measurements will add to future decisions. This assessment requires a common decision-making simulation framework that can integrate the input of geo-modelers, geophysicists, and reservoir engineers.
This work presents an example of how a simplified Closed-Loop Reservoir Management (CLRM) scheme can be used as a framework for simulating NPV changes due to the assimilation of production and saturation data in a simple toy model. It combines state-of-the-art data assimilation and uncertainty modeling methods with a robust-optimization genetic algorithm to calculate NPV improvements due to model updating and their relationship with the NPV obtained from the synthetic reservoir.
In this context, a simple synthetic model is presented. It recreates a segment of a greenfield under strong aquifer influence with two discovery wells. The reservoir development requires the selection of four well locations at fixed drilling times. The development strategy is selected using a genetic algorithm within the CLRM framework. Two cases are then presented: in the first, only production data are assimilated after the first two wells have been drilled, just before deciding the locations of the last two wells; in the second, production and saturation data are assimilated at the same time. The assimilated saturation map is assumed to be the output of a 4D seismic acquisition. The model update requires optimally relocating the last two wells, which results in an NPV change.
The results show that the obtained NPV increases with the relocation of the last two wells in both cases. A larger increase is obtained when both production and saturation data are assimilated. In addition, the ensemble's forecast capability improves most when saturation assimilation is included. Nevertheless, the ensemble's expected NPV after assimilation decreases from the value obtained in the first development strategy optimization; this indicates an optimistic early NPV valuation caused by the spread of the initial ensemble distributions.
The study presents an asset simulation framework that could be used to evaluate data acquisition investments through the systematic modeling of reservoir uncertainties within a decision-oriented focus. Extensions could include additional uncertain model parameters, the insertion of water injectors and well conversions, the assimilation of saturations at different intervals, and changes in the quality of the assimilated saturation maps, in addition to sensitivity studies of other economic constraints.
Coupled reservoir flow and geomechanics has numerous important applications in the oil and gas industry, such as land subsidence, hydraulic fracturing, fault reactivation, and hydrocarbon recovery. High-fidelity numerical schemes and multiphysics models must be coupled in order to simulate these processes and their interactions accurately and efficiently. Specifically, in CO2 sequestration applications, the effect of geomechanics on carbon storage estimation is not negligible. However, coupled flow-geomechanics simulations are computationally expensive, and most of the computational time is usually spent on the geomechanics calculations. This paper investigates a three-way coupling algorithm that uses an error indicator to determine when displacement must be updated and whether the fixed-stress iterative coupling technique is required. Numerical experiments with coupled nonlinear single-phase flow and linear poromechanics show that the three-way coupling algorithm achieves a fourfold speedup compared to the fixed-stress iterative coupling algorithm. Extensions to coupled compositional flow with poromechanics also show a fivefold speedup for continuous CO2 sequestration applications and a twofold speedup for surfactant-alternating-gas (SAG) applications. The substantial speedup makes the three-way coupling of flow and geomechanics feasible for large-scale optimization. Based on the three-way coupling of compositional flow and geomechanics, we tested two black-box optimization algorithms, the covariance matrix adaptation evolution strategy (CMA-ES) and a genetic algorithm (GA), for the optimization of well controls during the SAG process to maximize CO2 storage volume. CMA-ES outperforms GA in that it is more robust and achieves a higher objective function value in fewer simulation runs. The optimized SAG process achieves 27.55% more CO2 storage volume and reduces water and surfactant consumption by 54.84%.
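The core idea behind CMA-ES — sampling candidates from an adaptive multivariate Gaussian and updating its mean and covariance from the best samples — can be sketched as follows. This is a deliberately simplified illustration (no step-size control or evolution paths, unlike a full CMA-ES) and not the optimizer used in the paper:

```python
import numpy as np

def cma_es_sketch(f, x0, sigma=0.5, n_iter=80, lam=16, rng=None):
    """Simplified CMA-ES-style minimizer: Gaussian sampling with a rank-mu
    covariance update (step-size control and evolution paths are omitted)."""
    rng = rng or np.random.default_rng(0)
    n, mu = len(x0), lam // 2
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()                                    # log-linear recombination weights
    mean, C = np.array(x0, float), np.eye(n)
    for _ in range(n_iter):
        z = rng.multivariate_normal(np.zeros(n), C, size=lam)
        X = mean + sigma * z                        # lambda candidate solutions
        order = np.argsort([f(x) for x in X])[:mu]  # keep the mu best
        y = z[order]
        mean = mean + sigma * (w @ y)               # weighted recombination
        C = 0.7 * C + 0.3 * (y.T * w) @ y           # simplified rank-mu update
        C = (C + C.T) / 2                           # keep the covariance symmetric
    return mean

# Demo: minimize a simple quadratic bowl starting from [3, 3]
res = cma_es_sketch(lambda x: np.sum((x - 1.0) ** 2), [3.0, 3.0])
```

Because the covariance adapts to the shape of the selected steps, this family of methods tends to need fewer expensive simulation runs than a GA on smooth well-control problems, which is consistent with the comparison reported above.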
Motie, Mohadeseh (Tarh O Palayesh Engineering Co.) | Moein, Peyman (Tarh O Palayesh Engineering Co.) | Moghadasi, Ramin (Department of Earth Sciences, Uppsala University) | Hadipour, Ali (Tarh O Palayesh Engineering Co.)
Once reservoir fluid is brought to the surface, it is separated into discrete streams of oil and gas in the production unit. This should be conducted with a view to maximizing oil production while minimizing total costs. Regulating separator pressures is a key point in achieving the highest oil recovery in this process. In addition, estimating the optimum number of separation stages improves the economics of the production plant design. Separator pressures are usually adjusted using PVT data obtained from flash or equilibrium separation tests. When experimental tests are unavailable, empirical correlations offer an alternative. However, such approximate methods are inaccurate and time consuming, which may limit their utility. In this study, the issue was addressed by applying a genetic algorithm (GA) together with flash calculations in a case study on an Iranian oil field. The set-point pressures of each stage were optimized to enhance the liquid share of the fluid by minimizing the gas-oil ratio (GOR). Prior to this, four scenarios were considered for determining the number of separation stages. To determine the best scenario, a cost evaluation was implemented, which led to the selection of the four-separator scenario as the most economical case.
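A real-coded genetic algorithm of the kind described can be sketched generically. In the example below the objective is a hypothetical GOR proxy standing in for the actual flash calculations, and all bounds and numbers are invented for illustration; only the algorithmic skeleton is meaningful:

```python
import numpy as np

def ga_minimize(f, bounds, pop_size=40, n_gen=60, rng=None):
    """Real-coded GA: binary tournament selection, blend crossover,
    Gaussian mutation, and one-elitist survival."""
    rng = rng or np.random.default_rng(1)
    lo, hi = np.array(bounds, float).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    for _ in range(n_gen):
        fit = np.array([f(x) for x in pop])

        def tournament():
            i, j = rng.integers(pop_size, size=2)
            return pop[i] if fit[i] < fit[j] else pop[j]

        new = [pop[fit.argmin()].copy()]               # elitism: keep the best design
        while len(new) < pop_size:
            p1, p2 = tournament(), tournament()
            a = rng.random(len(lo))                    # blend crossover
            child = a * p1 + (1.0 - a) * p2
            child += rng.normal(0.0, 0.05 * (hi - lo)) # Gaussian mutation
            new.append(np.clip(child, lo, hi))
        pop = np.array(new)
    fit = np.array([f(x) for x in pop])
    return pop[fit.argmin()]

# Hypothetical example: toy GOR proxy (invented numbers) standing in for flash
# calculations; decision variables are two separator set-point pressures (psia)
gor = lambda p: ((p[0] - 450.0) / 100.0) ** 2 + ((p[1] - 120.0) / 30.0) ** 2
best = ga_minimize(gor, bounds=[(100.0, 900.0), (15.0, 300.0)])
```

In the study itself, each fitness evaluation would call a flash calculation on the staged separator train rather than a closed-form proxy.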
Sailboat routing consists of computing the best route for a sailboat, taking into account the characteristics of the ship and environmental data such as weather forecasts. In the context of sailing races, the best-route computation is usually based on the isochrone algorithm, a sub-optimal method that minimizes the time to destination (TtD) by computing a route as a sequence of waypoints. In this paper, we propose to compute a set of possible routes by considering two criteria: the time to destination and the stress. The time to destination is evaluated according to weather forecasts and boat polar diagrams. The stress function is a combination of human and environmental factors. The set of possible routes is then obtained using an iterative multiple-objective optimization algorithm. The isochrone algorithm is used to initialize the set of routes, and mutation operators are then used to explore alternative solutions. Applied to realistic test cases, our search strategy yields routes with very different characteristics in terms of time to destination and stress values, as confirmed by experienced sailors. Concerning the main objective of minimizing time to destination, we are competitive with commercial software such as MaxSea or Adrena.
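The isochrone algorithm can be sketched as follows: from the start point, time fronts are grown in fixed steps over all headings, each front is pruned to its outer envelope, and expansion stops once the goal is reached. The Python sketch below uses an invented toy polar diagram and a flat 2D plane with no weather grid, so it illustrates the mechanics only, not the paper's implementation:

```python
import math

def isochrone_route(start, goal, boat_speed, dt=1.0, n_headings=36,
                    n_sectors=72, max_steps=50):
    """Isochrone sketch: grow time fronts and prune each front to its outer envelope."""
    paths = [[start]]
    for _ in range(max_steps):
        candidates = []
        for path in paths:
            x, y = path[-1]
            for k in range(n_headings):
                h = 2 * math.pi * k / n_headings
                v = boat_speed(h)                 # speed from the (toy) polar diagram
                candidates.append(path + [(x + v * dt * math.cos(h),
                                           y + v * dt * math.sin(h))])
        # prune: keep only the farthest point from the start in each bearing sector
        best = {}
        for path in candidates:
            px, py = path[-1]
            bearing = math.atan2(py - start[1], px - start[0])
            sector = int((bearing + math.pi) / (2 * math.pi) * n_sectors) % n_sectors
            d2 = (px - start[0]) ** 2 + (py - start[1]) ** 2
            if sector not in best or d2 > best[sector][0]:
                best[sector] = (d2, path)
        paths = [p for _, p in best.values()]
        for path in paths:                        # stop once a front point reaches the goal
            if math.dist(path[-1], goal) < boat_speed(0.0) * dt:
                return path
    return None

# Toy polar: fastest on a beam reach, slower dead up/downwind (purely illustrative)
polar = lambda h: 0.8 + 0.4 * abs(math.sin(h))
route = isochrone_route((0.0, 0.0), (5.0, 0.0), polar)
```

A real router would additionally look up wind from the forecast at each front point and time step; the multi-objective extension in the paper then mutates such routes to trade time to destination against stress.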
Vazquez, Oscar (Heriot-Watt University) | Ross, Gill (Chrysaor) | Jordan, Myles Martin (Nalco Champion) | Baskoro, Dionysius Angga Adhi (Heriot-Watt University) | Mackay, Eric (Heriot-Watt University) | Johnston, Clare (Nalco Champion) | Strachan, Alistair (Nalco Champion)
Oilfield-scale deposition is one of the important flow-assurance challenges facing the oil industry. There are a number of methods to mitigate oilfield scale, such as reducing sulfates in the injected brine, reducing water flow, removing damage by using dissolvers or physically by milling or reperforating, and inhibition, which is particularly recommended if a severe risk of sulfate-scale deposition is present. Inhibition consists of injecting a chemical that prevents the deposition of scale, either by stopping nucleation or by retarding crystal growth. The inhibiting chemicals are either injected in a dedicated continuous line or bullheaded as a batch treatment into the formation, commonly known as a scale-squeeze treatment. In general, scale-squeeze treatments consist of the following stages: preflush to condition the formation or act as a buffer to displace tubing fluids; the main treatment, where the main pill of chemical is injected; overflush to displace the chemical deep into the reservoir; a shut-in stage to allow further chemical retention; and placing the well back in production. The well will be protected as long as the concentration of the chemical in the produced brine is greater than a certain threshold, commonly known as minimum inhibitor concentration (MIC). This value is usually between 1 and 20 ppm. The most important factor in a squeeze-treatment design is the squeeze lifetime, which is determined by the volume of water or days of production where the chemical-return concentration is greater than the MIC.
The main purpose of this paper is to describe the automatic optimization of squeeze-treatment designs using an optimization algorithm, in particular particle-swarm optimization (PSO). The algorithm provides a number of optimal designs, which result in squeeze lifetimes close to the target. To determine the most efficient of the optimal designs identified by the algorithm, the following objectives were considered: operational-deployment costs, chemical cost, total-injected-water volume, and squeeze-treatment lifetime. Operational-deployment costs include the support vessel, pump, and tank hire. There might not be a single design optimizing all objectives, and thus the problem becomes a multiobjective optimization. Therefore, a number of Pareto optimal solutions exist. These designs are not dominated by any other design and cannot be bettered. Calculating the Pareto front is essential to identify the most efficient (i.e., most cost-effective) design.
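A textbook global-best PSO loop of the kind used here can be sketched briefly. In the example below the objective is a purely hypothetical squeeze-lifetime misfit over two invented design variables (main-treatment and overflush volumes), not the paper's actual simulator-based objective:

```python
import numpy as np

def pso_minimize(f, bounds, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5, rng=None):
    """Textbook global-best particle-swarm optimization."""
    rng = rng or np.random.default_rng(0)
    lo, hi = np.array(bounds, float).T
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()                 # global best position
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # velocity update: inertia + cognitive pull + social pull
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f                          # update personal bests
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, float(pbest_f.min())

# Hypothetical objective: squared deviation of squeeze lifetime from a target,
# over two invented design variables (main-treatment and overflush volumes)
misfit = lambda d: (d[0] - 50.0) ** 2 + (d[1] - 200.0) ** 2
design, best_f = pso_minimize(misfit, bounds=[(0.0, 100.0), (0.0, 500.0)])
```

For the multiobjective version described in the abstract, a single scalar misfit is replaced by the set of cost and lifetime objectives, and the non-dominated designs form the Pareto front.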
Heel-dominated treatment distribution among multiple perforation clusters is frequently observed in plug-and-perf stages, causing small propped surface areas, suboptimal production, and unexpected frac-hits.
A multi-fracture simulator with a novel wellbore fluid and proppant transport model is applied to quantify treatment distribution among multiple perforation clusters in a plug-and-perf operation. A simulation Base Case is set up based on a field treatment design with four clusters. Simulation results show that the two toe-side clusters screened out early in the treatment and the two heel-side clusters were dominant. The simulated proppant placement is consistent with DAS observations.
The impact of different perforating strategies and pumping schedules on final treatment distribution is investigated. Two criteria are defined that quantify the proppant distribution and fracture area: the Weighted Average (WA) and Standard Deviation (SD) of the final fluid and proppant distribution, as well as the Hydraulic and Propped Surface Area (HSA and PSA) of the created fractures. An optimum plug-and-perf design is defined as one that minimizes the SD of the treatment distribution among perforation clusters and maximizes the PSA.
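The exact definitions of the WA and SD criteria are not reproduced here, but one plausible reading of the SD criterion — the spread of per-cluster proppant fractions around a perfectly even split — can be computed as follows (illustrative numbers only):

```python
import numpy as np

def uniformity_sd(amount_per_cluster):
    """SD of per-cluster fractions relative to a perfectly even split (0 = uniform)."""
    frac = np.asarray(amount_per_cluster, float)
    frac = frac / frac.sum()                    # fraction of total placed in each cluster
    return float(np.sqrt(np.mean((frac - 1.0 / frac.size) ** 2)))

print(uniformity_sd([25.0, 25.0, 25.0, 25.0]))  # perfectly even split → 0.0
print(uniformity_sd([10.0, 15.0, 30.0, 45.0]))  # heel-dominated stage → larger SD
```

Under this reading, the optimum design in the abstract is the one driving this SD toward zero while maximizing the propped surface area.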
Both perforating strategy and pumping schedule are found to affect the final treatment distribution significantly, and uniform treatment distribution is shown to create more PSA. Fewer perforations-per-cluster were found to promote uniform fluid and proppant placement. Other helpful strategies include reducing the number of perforations near the heel, using small, lightweight proppant and so on. The stress shadow effect is accounted for using the Displacement Discontinuity Method (DDM) and was found to play a smaller role than perforation friction and proppant inertia in most cases.
An automated process is developed to optimize plug-and-perf completion design with multiple decision variables using a genetic algorithm. Thirteen parameters are optimized simultaneously. The optimal design creates an almost even treatment distribution and more than doubles the PSA compared to the Base Case.
The multi-fracture model presented in this paper provides a way to quantify fluid and proppant distribution for any perforating strategy and pumping schedule, and offers more insight into the physics relevant to plug-and-perf treatment distribution. The perforation and pumping-schedule recommendations presented in this paper provide directional guidance for designing a fracturing job with balanced treatment distribution and large propped surface area.
A major challenge in carbonate reservoirs is the highly fractured nature of the rock. The flow rate may be high or low depending on the targeted fracture clusters. In addition, flow rates may vary from one region of the reservoir to another. Smart wells equipped with a smart completion strategy present great prospects for producing such reservoirs intelligently, thereby helping to deal with heterogeneities. It is established that early water breakthrough occurs when multilateral wells are completed with constant choke settings; one way to mitigate this problem is to use smart completions that manage unexpected production through fractures, thereby increasing ultimate recovery. Early water breakthrough is expected because, if a lateral section intersects a cluster of fractures, those fractures may connect with the water zone and trigger breakthrough. This can be managed by preferentially regulating production from the multiple laterals.
The evident communication among the various laterals of the mother bore makes it difficult to optimize production from intervals of variable productivity. In theory, the optimization of a smart completion involves different constraints; nevertheless, the setting of the smart inflow control valve (ICV) is the single most important parameter and may prove to be the differentiating factor between a high-producing well and a poorly producing one. This study focuses on the reservoir engineering task of finding the optimum choke settings that lead to maximum recovery.
Computational intelligence through particle swarm optimization (PSO) is utilized as the core algorithm to determine the optimal ICV configuration for a fishbone well in a naturally fractured carbonate reservoir. A commercial black-oil simulator was used to evaluate the objective function, which measures the fitness of a choke configuration; this was carried out under a MATLAB workflow that coupled the optimization algorithm with the numerical simulator. A single fishbone well with 15 laterals was studied in order to assess the effect of the fracture network on water breakthrough and the consequent impact on recovery.
Three different scenarios are developed to assess the impact of optimization: a base case employing only multilateral well technology without smart well completion, a smart well completion scheme with no optimization, and finally the optimized smart well completion. The results clearly demonstrate the need for optimization and highlight the role of intelligent completions for wells in the reservoir under study. Without smart wells, water breakthrough occurs relatively early and less hydrocarbon is produced; as smart wells are incorporated, the results improve, and with the fully optimized ICV scheme the recovery increases by almost 80%, from 21% to 38%. Moreover, the time to water breakthrough and the cumulative water cut are also managed quite significantly.
Azevedo, Leonardo (Cerena/Decivil, Instituto Superior Técnico) | Demyanov, Vasily (Institute of Petroleum Engineering, Heriot-Watt University) | Lopes, Diogo (Cerena/Decivil, Instituto Superior Técnico) | Soares, Amílcar (Cerena/Decivil, Instituto Superior Técnico) | Guerreiro, Luis (Partex Oil & Gas)
Geostatistical seismic inversion uses stochastic sequential simulation and co-simulation as the perturbation techniques to generate and perturb elastic models. These inversion methods allow the retrieval of high-resolution inverse models and the assessment of the spatial uncertainty of the inverted properties. However, they assume a set of a priori parameters that are often considered known and certain and that are exactly reproduced in the final inverted models. This is the case for the top and base of the main seismic units, to which regional variogram models and histograms are assigned. Nevertheless, the amount of existing well-log data (i.e., direct measurements) of the property to be inverted is often not enough to model variograms, and its histograms are biased towards the more sand-prone facies. This work presents a consistent stochastic framework that allows these uncertainties, associated with large-scale geological features, to be quantified. We couple stochastic adaptive sampling (i.e., particle swarm optimization) with global stochastic inversion to infer three-dimensional acoustic impedance from existing seismic reflection data. Key uncertain geological parameters are first identified, and reliable a priori distributions inferred from geological knowledge are assigned to each parameter. The type and shape of each distribution reflects the level of knowledge about that parameter. Then, particle swarm optimization is integrated into an iterative geostatistical seismic inversion methodology, and these parameters are optimized along with the spatial distribution of acoustic impedance. At the end of the iterative procedure, we retrieve the best-fit inverse model of acoustic impedance along with the most probable location of the top and base of each seismic unit and the most likely histogram and variogram model per zone.