We interpreted a series of single-well chemical-tracer tests (SWCTTs) conducted on a Greater Burgan well to estimate residual oil saturation (SORW) after a base high-salinity waterflood, a low-salinity waterflood, and a subsequent polymer flood. Interpretation of the tests requires history matching the back-production of partitioning and non-partitioning tracers, which is affected by differing amounts of irreversible flow and dispersion as well as by the amount of residual oil.
We applied a state-of-the-art chemical reservoir simulator (UTCHEM) and an assisted-history-matching tool (BP's Top-Down Reservoir Modeling) to interpret the tests and accurately quantify the uncertainty in residual oil saturation after the high-salinity, low-salinity, and polymer floods. Two optimization algorithms, a genetic algorithm (GA) and a particle swarm optimization (PSO)-mesh adaptive direct search (MADS) algorithm, were applied to better address this uncertainty.
Our results show a six-saturation-unit decrease in SORW after the low-salinity flood and no change in SORW after the polymer flood. This is in line with our expectations: we expect no change in SORW post-polymer because a conventional HPAM, which does not exhibit viscoelastic behavior, was used in the test. We demonstrate that history matching the back-produced tracer profiles is a robust approach to estimating SORW by showing that the choice of a three- or four-layer simulation model does not change the estimated SORW. We also accounted for the uncertainty in the partition coefficient in our uncertainty estimates.
We present several innovations that improve the history matching of back-produced tracer profiles and hence the SORW estimates, for example assigning a different dispersivity to each simulation layer to account for its heterogeneity rather than assuming a single dispersivity for all layers. We generate more robust estimates of uncertainty by finding a range of alternative history matches, all of which are consistent with the measured data.
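As a rough illustration of the kind of misfit such a history match minimizes, the sketch below (Python, with hypothetical tracer names and synthetic profiles; the actual UTCHEM/TDRM workflow is not reproduced here) computes a least-squares mismatch between measured and simulated back-produced concentrations for both partitioning and non-partitioning tracers.

```python
import numpy as np

def tracer_misfit(measured, simulated):
    """Sum-of-squares mismatch over all back-produced tracer profiles.

    `measured` and `simulated` map a tracer name (e.g. "partitioning",
    "non_partitioning") to concentration arrays sampled at the same
    produced-volume points.  Names and scaling are illustrative.
    """
    total = 0.0
    for tracer, obs in measured.items():
        sim = simulated[tracer]
        # Normalize by the peak observed concentration so both tracers
        # contribute on a comparable scale.
        scale = float(np.max(np.abs(obs)))
        scale = scale if scale > 0 else 1.0
        total += np.sum(((obs - sim) / scale) ** 2)
    return total

# Hypothetical usage with synthetic profiles on a common produced-volume grid.
obs = {"partitioning": np.array([0.0, 0.4, 1.0, 0.6, 0.2]),
       "non_partitioning": np.array([0.0, 0.8, 0.9, 0.3, 0.1])}
sim = {k: v * 0.95 for k, v in obs.items()}
print(tracer_misfit(obs, sim))
```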
The traditional trial-and-error approach to history matching requires engineers to adjust each uncertain parameter by hand to obtain an accurate model, and it can be quite time consuming and inefficient. In contrast, automatic history matching (AHM), assisted by computers, is an efficient process in which an algorithm controls a large number of parameters simultaneously, integrating the static model with dynamic data to minimize a misfit and improve reliability. It also helps to reduce simulation run time.
Particle swarm optimization (PSO) is a population-based stochastic algorithm that can explore the parameter space, and here it is combined with a least-squares single-objective function. The AHM process can adopt parameterization and realization methods to reduce the complexity of the inverse problem. In this study, realizations of various reservoir properties such as porosity, net-to-gross, relative permeability, horizontal and vertical permeability, and aquifer size were chosen as the parameters controlled throughout the AHM. History matching was conducted to validate the efficiency of each method. Guidelines for optimizing AHM with a stochastic algorithm are also discussed.
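For readers unfamiliar with PSO, a minimal sketch of the particle update with a least-squares misfit is given below (Python; the misfit function, bounds, and swarm settings are placeholders for illustration and do not represent the study's actual AHM setup, where each evaluation would be a reservoir simulation run).

```python
import numpy as np

rng = np.random.default_rng(0)

def misfit(x):
    """Placeholder least-squares misfit; a real AHM run would call the
    reservoir simulator and compare simulated to observed dynamic data."""
    return float(np.sum((x - 0.3) ** 2))

n_particles, n_dims, n_iters = 20, 5, 50
w, c1, c2 = 0.7, 1.5, 1.5                        # inertia and acceleration weights
lo, hi = 0.0, 1.0                                # normalized parameter bounds

x = rng.uniform(lo, hi, (n_particles, n_dims))   # particle positions
v = np.zeros_like(x)                             # particle velocities
pbest, pbest_f = x.copy(), np.array([misfit(p) for p in x])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(n_iters):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    f = np.array([misfit(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("best misfit:", pbest_f.min())
```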
The realization and parameterization methods improved the matching results in a full-field application, yielding a reduced misfit in less time. A stochastic algorithm generates multiple models and deduces control parameters that reduce the misfit. In this study we identified that PSO converged effectively as the control parameters were updated. The optimized AHM improved the accuracy of the full-field model, although some misfit remained in the match to bottomhole pressure.
We found that updating too many parameters makes the problem difficult to solve, while using too few leads to false convergence. In addition, because simulation run time is critical, a full-field simulation model with reduced computational overhead is beneficial.
In this study, we observed that PSO was an efficient algorithm for updating the control parameters to reduce the misfit. Using parameterization and realization as assisting methods helped find better results. Overall, this study can be used as a guideline for optimizing the history-matching process.
Time-lapse seismic monitoring is a powerful technique for reservoir management and the optimization of hydrocarbon recovery. In time-lapse seismic datasets, the difference in seismic properties across vintages enables the detection of spatio-temporal changes in saturation and structure induced by production. The main objectives are (1) to identify bypassed pay zones in time-lapse seismic data for the deepwater Amberjack field, located in the Gulf of Mexico, (2) to confirm the identified bypassed pay zones against reservoir simulation results, and (3) to recommend well-planning strategies to exploit these bypassed resources.
A high-fidelity seismic-to-simulation 4D workflow that incorporates seismic, petrophysics, petrophysical property modeling, and reservoir simulation was employed, leveraging cross-discipline interaction, interpretation, and integration to extend asset management capabilities. The workflow addresses geology (well log interpretation and framework development), geophysics (seismic interpretation, velocity modeling, and seismic inversion), and petrophysical property modeling (earth models and co-located co-simulation of petrophysical properties with P-impedance from seismic inversion). An embedded petro-elastic model (PEM) in the reservoir simulator is then used to combine spatial dry rock properties with saturation properties to compute dynamic elastic properties, which can be related to multi-vintage P-impedance from time-lapse seismic inversion. In the absence of the requisite dry rock properties for the PEM, a small data engine is used to determine these absent properties using metaheuristic optimization techniques. Specifically, two particle swarm optimization (PSO) applications, including an exterior penalty function (EPF), are modified, resulting in the development of the nested and average methods, respectively. These methods simultaneously calculate the missing rock parameters (dry rock bulk modulus, shear modulus, and density) necessary for dynamic, embedded P-impedance calculation in the history-constrained reservoir simulation results. Afterward, a graphic-enabled method is devised to select the threshold that discriminates non-reservoir (including bypassed pay) from reservoir in the P-impedance difference, and its results are compared to unsupervised learning (k-means clustering and hierarchical clustering). From seismic data, one can identify bypassed pay locations, which are confirmed from reservoir simulation after conducting a seismic-driven history match. Finally, infill wells are planned and then modeled in the reservoir simulator.
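To give a concrete sense of how the missing dry rock properties enter such a petro-elastic model, the sketch below (Python; the function names, mineral and fluid values, bounds, and penalty weight are assumptions, not the paper's nested or average methods) computes saturated P-impedance via Gassmann fluid substitution from a trial dry rock bulk modulus, shear modulus, and density, and shows how an exterior penalty function can fold bound violations into a PSO fitness value.

```python
import numpy as np

def gassmann_p_impedance(k_dry, mu, rho_dry, phi, k_min=37.0, k_fl=2.6, rho_fl=1.0):
    """Saturated P-impedance from dry rock properties via Gassmann.

    Moduli in GPa, densities in g/cc, porosity as a fraction; the mineral and
    fluid values here are generic quartz/brine numbers, not Amberjack data.
    """
    k_sat = k_dry + (1.0 - k_dry / k_min) ** 2 / (
        phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    )
    rho_sat = rho_dry + phi * rho_fl
    vp = np.sqrt((k_sat + 4.0 / 3.0 * mu) / rho_sat)   # km/s with these units
    return rho_sat * vp                                # P-impedance in (g/cc)*(km/s)

def penalized_fitness(x, ip_obs, phi, bounds, r=1e3):
    """Exterior penalty: mismatch to the inverted impedance plus squared bound violations."""
    k_dry, mu, rho_dry = x
    mismatch = (gassmann_p_impedance(k_dry, mu, rho_dry, phi) - ip_obs) ** 2
    penalty = sum(max(0.0, lo - xi) ** 2 + max(0.0, xi - hi) ** 2
                  for xi, (lo, hi) in zip(x, bounds))
    return mismatch + r * penalty

# Hypothetical usage: one cell with 25% porosity and an observed P-impedance.
bounds = [(1.0, 30.0), (1.0, 30.0), (1.8, 2.5)]        # k_dry, mu, rho_dry ranges
print(penalized_fitness(np.array([12.0, 10.0, 2.2]), ip_obs=7.5, phi=0.25, bounds=bounds))
```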
Vazquez, Oscar (Heriot-Watt University) | Ross, Gill (Chrysaor) | Jordan, Myles Martin (Nalco Champion) | Baskoro, Dionysius Angga Adhi (Heriot-Watt University) | Mackay, Eric (Heriot-Watt University) | Johnston, Clare (Nalco Champion) | Strachan, Alistair (Nalco Champion)
Oilfield-scale deposition is one of the important flow-assurance challenges facing the oil industry. There are a number of methods to mitigate oilfield scale, such as reducing sulfates in the injected brine, reducing water flow, removing damage by using dissolvers or physically by milling or reperforating, and inhibition, which is particularly recommended if a severe risk of sulfate-scale deposition is present. Inhibition consists of injecting a chemical that prevents the deposition of scale, either by stopping nucleation or by retarding crystal growth. The inhibiting chemicals are either injected in a dedicated continuous line or bullheaded as a batch treatment into the formation, commonly known as a scale-squeeze treatment. In general, scale-squeeze treatments consist of the following stages: preflush to condition the formation or act as a buffer to displace tubing fluids; the main treatment, where the main pill of chemical is injected; overflush to displace the chemical deep into the reservoir; a shut-in stage to allow further chemical retention; and placing the well back in production. The well will be protected as long as the concentration of the chemical in the produced brine is greater than a certain threshold, commonly known as minimum inhibitor concentration (MIC). This value is usually between 1 and 20 ppm. The most important factor in a squeeze-treatment design is the squeeze lifetime, which is determined by the volume of water or days of production where the chemical-return concentration is greater than the MIC.
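As a simple illustration of how squeeze lifetime follows from the returned-chemical profile, the sketch below (Python, with a made-up return curve; the MIC, volumes, and decay rate are hypothetical) finds the produced-water volume over which the inhibitor return concentration stays at or above the MIC.

```python
import numpy as np

def squeeze_lifetime(produced_volume, return_conc, mic):
    """Produced-water volume protected by the treatment: the last point at
    which the inhibitor return concentration is still at or above the MIC."""
    above = return_conc >= mic
    if not above.any():
        return 0.0
    return float(produced_volume[np.where(above)[0][-1]])

# Hypothetical exponential-like return curve (volume in bbl, concentration in ppm).
vol = np.linspace(0.0, 2.0e5, 200)
conc = 500.0 * np.exp(-vol / 3.0e4)
print(squeeze_lifetime(vol, conc, mic=5.0))   # protected volume at a 5 ppm MIC
```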
The main purpose of this paper is to describe the automatic optimization of squeeze-treatment designs using an optimization algorithm, in particular particle-swarm optimization (PSO). The algorithm provides a number of optimal designs that result in squeeze lifetimes close to the target. To determine the most efficient of the optimal designs identified by the algorithm, the following objectives were considered: operational-deployment costs, chemical cost, total-injected-water volume, and squeeze-treatment lifetime. Operational-deployment costs include the support vessel, pump, and tank hire. There might not be a single design that optimizes all objectives, and thus the problem becomes a multiobjective optimization; therefore, a number of Pareto-optimal solutions exist. These designs are not dominated by any other design and cannot be bettered. Calculating the Pareto front is essential to identify the most efficient design (i.e., the most cost-effective design).
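A minimal sketch of the Pareto-filtering step is given below (Python; the design names and cost figures are invented for illustration), keeping only designs that are not dominated on any of the competing objectives, all expressed as quantities to minimize.

```python
def pareto_front(designs):
    """Return the non-dominated designs.

    Each design is (name, objectives), where objectives is a tuple of values
    to minimize, e.g. (deployment cost, chemical cost, injected-water volume,
    -squeeze lifetime).  Lifetime is negated so that 'larger is better'
    becomes 'smaller is better' like the costs.
    """
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    return [d for d in designs
            if not any(dominates(other[1], d[1]) for other in designs if other is not d)]

# Hypothetical candidate designs returned by the optimizer.
candidates = [
    ("design A", (1.2e6, 3.0e5, 8.0e3, -250)),
    ("design B", (1.0e6, 3.5e5, 9.0e3, -240)),
    ("design C", (1.3e6, 3.2e5, 8.5e3, -230)),   # dominated by design A
]
print([name for name, _ in pareto_front(candidates)])
```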
A major challenge in carbonate reservoirs is the highly fractured nature of the rock. The flow rate may be high or low depending on the targeted fracture clusters, and flow rates can also vary from one region of the reservoir to another. Smart wells furnished with a smart completion strategy present great prospects for producing such reservoirs intelligently and thereby dealing with heterogeneities more effectively. It is established that early water breakthrough occurs when multilateral wells are completed with constant choke settings; one way to mitigate this problem is to use smart completions that manage unexpected production through fractures, thereby increasing ultimate recovery. Early water breakthrough is expected because, if a lateral section intersects a cluster of fractures, those fractures may connect with the water zone and trigger breakthrough. This can be managed by preferentially regulating production from the multiple laterals.
The evident communication among the various laterals of the mother bore makes it difficult to optimize production from intervals of variable productivity. In theory, the optimization of a smart completion involves many constraints; nevertheless, the setting of the smart inflow control valves (ICVs) is the single most important parameter and may be the differentiating factor between a high-producing well and a poorly producing one. This study focuses on the reservoir engineering task of finding the optimum choke settings that lead to maximum recovery.
Computational intelligence through particle swarm optimization (PSO) is utilized as the core algorithm to determine the optimal ICV configuration for a fishbone well in a naturally fractured carbonate reservoir. A commercial black-oil simulator was used to evaluate the objective function, whose role here is to assess the fitness of a given choke configuration; this was carried out in a MATLAB workflow that coupled the optimization algorithm with the numerical simulator. A single fishbone well with 15 laterals was studied to examine the effect of the fracture network on water breakthrough and its consequent impact on recovery.
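The coupling pattern is easy to sketch: each PSO particle is a vector of choke settings, and the fitness call hands that vector to the simulator and returns a recovery-based objective. The sketch below is in Python rather than the MATLAB used in the study, and the simulator call is replaced by a mock surrogate so the snippet executes; the function names and settings are illustrative only.

```python
import numpy as np

N_ICV = 15  # one choke setting per lateral in the fishbone well

def run_simulator(choke_settings):
    """Placeholder for the commercial black-oil simulator call.  In the actual
    workflow this step would write the ICV settings into the simulation deck,
    run the simulator, and read back cumulative oil; here a smooth surrogate
    is returned just so the sketch executes."""
    return 1.0e6 * float(np.sum(np.sqrt(choke_settings)))  # mock cumulative oil

def fitness(choke_settings):
    """Objective evaluated for each PSO particle: negative cumulative oil, so
    that minimizing the fitness maximizes recovery."""
    return -run_simulator(np.clip(choke_settings, 0.0, 1.0))

# Hypothetical usage: evaluate one candidate configuration (fractions open).
candidate = np.full(N_ICV, 0.5)
print(fitness(candidate))
```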
Three different scenarios were developed to assess the impact of optimization: a base case employing only multilateral well technology without a smart completion, a smart well completion with no optimization, and finally the optimized smart well completion. The results clearly demonstrate the need not only for optimization but also for intelligent completions for wells in the reservoir studied. Without smart wells, water breakthrough occurs relatively early and less hydrocarbon is produced; as smart wells are incorporated the results improve, and with the fully optimized ICV scheme the recovery increases from 21% to 38%, an improvement of almost 80%. Moreover, the time to water breakthrough, and eventually the cumulative water cut, are also managed quite significantly.
Azevedo, Leonardo (Cerena/Decivil, Instituto Superior Técnico) | Demyanov, Vasily (Institute of Petroleum Engineering, Heriot-Watt University) | Lopes, Diogo (Cerena/Decivil, Instituto Superior Técnico) | Soares, Amílcar (Cerena/Decivil, Instituto Superior Técnico) | Guerreiro, Luis (Partex Oil & Gas)
Geostatistical seismic inversion uses stochastic sequential simulation and co-simulation as the perturbation techniques to generate and perturb elastic models. These inversion methods allow the retrieval of high-resolution inverse models and the assessment of the spatial uncertainty of the inverted properties. However, they assume a priori parameterizations that are often considered known and certain and that are exactly reproduced in the final inverted models. This is the case for the top and base of the main seismic units, to which regional variogram models and histograms are assigned. Nevertheless, the amount of existing well-log data (i.e., direct measurements) of the property to be inverted is often not enough to model variograms, and its histograms are biased towards the more sand-prone facies. This work presents a consistent stochastic framework that quantifies the uncertainties in these parameters, which are associated with large-scale geological features. We couple stochastic adaptive sampling (i.e., particle swarm optimization) with global stochastic inversion to infer three-dimensional acoustic impedance from existing seismic reflection data. Key uncertain geological parameters are first identified, and reliable a priori distributions inferred from geological knowledge are assigned to each parameter. The type and shape of each distribution reflects the level of knowledge about that parameter. Then, particle swarm optimization is integrated as part of an iterative geostatistical seismic inversion methodology, and these parameters are optimized along with the spatial distribution of acoustic impedance. At the end of the iterative procedure, we retrieve the best-fit inverse model of acoustic impedance along with the most probable locations of the top and base of each seismic unit and the most likely histogram and variogram model per zone.
Modern seismic surveys generate terabytes of data daily, leading to a significant increase in the cost of storage and transmission; it is therefore desirable to compress seismic data. In this work, we propose a model-based compression scheme to deal with the large data volume. First, each seismic trace is modeled as a superposition of multiple exponentially decaying sinusoidal waves (EDSWs). Each EDSW represents a model component and is defined by a set of parameters. Second, a parameter estimation algorithm for this model is proposed using the particle swarm optimization (PSO) technique. In the proposed algorithm, the parameters of each EDSW are estimated sequentially, wave by wave. A suitable number of model components for each trace is determined according to the level of the residual energy. The proposed model-based compression scheme is then experimentally compared with the discrete cosine transform (DCT) on real seismic data. The proposed model-based algorithm outperforms the DCT in terms of compression ratio and reconstruction quality.
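To make the EDSW parameterization concrete, the sketch below (Python; the parameter values and stopping threshold are illustrative, and the PSO estimation itself is omitted) synthesizes a trace from a few components, each defined by an amplitude, decay rate, frequency, and phase, and shows the residual-energy criterion used to decide how many components to keep.

```python
import numpy as np

def edsw(t, amp, decay, freq, phase):
    """One exponentially decaying sinusoidal wave (EDSW) component."""
    return amp * np.exp(-decay * t) * np.cos(2.0 * np.pi * freq * t + phase)

t = np.arange(0.0, 1.0, 0.002)          # 2 ms sampling, 1 s trace
components = [                          # (amp, decay [1/s], freq [Hz], phase)
    (1.0, 3.0, 25.0, 0.3),
    (0.6, 5.0, 40.0, 1.1),
    (0.3, 8.0, 60.0, -0.7),
]
trace = sum(edsw(t, *p) for p in components)

# Sequential fit: after each estimated component is subtracted, stop once the
# residual energy drops below a fraction of the original trace energy.
residual = trace.copy()
for p in components:                    # stand-in for PSO-estimated parameters
    residual = residual - edsw(t, *p)
    ratio = np.sum(residual ** 2) / np.sum(trace ** 2)
    print(f"residual energy ratio: {ratio:.3e}")
    if ratio < 1e-3:
        break
```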
Presentation Date: Tuesday, October 16, 2018
Start Time: 1:50:00 PM
Location: Poster Station 20
Presentation Type: Poster
After Fan introduced the concept of synthetic aperture (SA) to marine controlled-source electromagnetics (CSEM), numerous optimization methods were applied to SA weight selection to improve the detectability of deeply buried targets beneath the seabed, but few are suitable for the severely nonlinear EM problems involved. This study presents an application of particle swarm optimization (PSO) to the optimization of the phase shift and the amplitude compensation coefficient of SA for marine CSEM. Eigenstate analysis (EA), another nonlinear SA weight optimization method, is also carried out in marine CSEM data processing for comparison. The 3D synthetic model reconstructed from Fan (2010) is used to demonstrate the detectability with and without the two optimization algorithms. To validate the effectiveness of PSO, we scan all the confined weights.
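The weighting itself amounts to a coherent sum of the fields recorded from the individual source positions; a minimal sketch is below (Python, with synthetic complex field values; the amplitude coefficients and phase shifts shown are arbitrary examples, not optimized weights).

```python
import numpy as np

def synthetic_aperture_field(fields, amplitudes, phase_shifts):
    """Weighted coherent sum of complex CSEM fields from several source
    positions: each field is scaled by an amplitude compensation coefficient
    and rotated by a phase shift before summation."""
    weights = amplitudes * np.exp(1j * phase_shifts)
    return np.sum(weights * fields)

# Hypothetical fields from five source positions at one receiver and frequency.
fields = np.array([1.0 + 0.2j, 0.8 - 0.1j, 0.6 + 0.3j, 0.5 - 0.2j, 0.4 + 0.1j]) * 1e-12
amplitudes = np.ones(5)                       # unweighted aperture
phases = np.zeros(5)
print(abs(synthetic_aperture_field(fields, amplitudes, phases)))
```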
The advancement in azimuthal propagation resistivity tools enables them to see much further into the formation than previous tools, which means that multilayer earth models are necessary for inversion. However, the multilayer inversion is oftentimes highly nonlinear, and gradient-based deterministic optimization methods such as gradient descent and Newton's method are prone to becoming trapped in local optima. Therefore, an alternative approach is needed to cater to the advanced logging tools. In this study, a stochastic method called particle swarm optimization (PSO) is recommended for multilayer geosteering inversion owing to its excellent global-search performance as well as its easy implementation and simplicity of parallelization. Two examples are provided at the end to demonstrate the capability of PSO in geosteering applications.
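The parallelization the abstract alludes to is straightforward because each particle's misfit can be evaluated independently; a minimal sketch is below (Python; the multilayer forward model is a placeholder, not a real azimuthal resistivity tool response).

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def forward_model(layers):
    """Placeholder multilayer forward response.  A real geosteering inversion
    would compute the tool's response for the candidate layered model
    (layer resistivities and boundary depths)."""
    return np.tanh(layers)          # stand-in for a nonlinear tool response

MEASURED = np.tanh(np.array([1.0, 0.2, 2.5, 0.8]))   # synthetic "observed" data

def misfit(layers):
    return float(np.sum((forward_model(layers) - MEASURED) ** 2))

def evaluate_swarm(positions):
    """Evaluate all particles in parallel; each row of `positions` is one
    candidate multilayer model."""
    with ProcessPoolExecutor() as pool:
        return np.array(list(pool.map(misfit, positions)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    swarm = rng.uniform(0.0, 3.0, size=(16, 4))
    print(evaluate_swarm(swarm).min())
```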
Presentation Date: Tuesday, October 16, 2018
Start Time: 8:30:00 AM
Location: 212A (Anaheim Convention Center)
Presentation Type: Oral
Tanaka, Shusei (Chevron Energy Technology Company) | Wang, Zhenzhen (Chevron Energy Technology Company) | Dehghani, Kaveh (Chevron Energy Technology Company) | He, Jincong (Chevron Energy Technology Company) | Velusamy, Baskar (Chevron Energy Technology Company) | Wen, Xian-Huan (Chevron Energy Technology Company)
Field development optimization for oil and gas reservoirs is typically challenging due to the large number of control parameters, the model complexity, and subsurface uncertainties. In this study, we propose a joint field development and well control optimization workflow using a robust parameterization technique and demonstrate its application to an offshore oil field development.
Traditionally, using simulation models to optimize the field development plan was considered time and cost prohibitive when incorporating models that cover the range of uncertainties in reservoir properties. Consequently, the problem was simplified by reducing the number of control parameters through multi-disciplinary workflows. In this paper, we aim to optimize the field development strategy by simultaneously controlling the topside facility, the number of wells, their trajectories, the drilling sequence, and the completion strategy, considering subsurface uncertainties and constraints. To achieve this, we used our next-generation reservoir simulator and commercial cloud computing to explore the possibility of reaching an optimized development scenario within reasonable time and cost constraints.
We applied the proposed workflow to the Olympus field case, an optimization benchmarking problem set up by the Netherlands Organisation for Applied Scientific Research (TNO) using a synthetic North Sea-type reservoir. Our objective is to improve the net present value (NPV) after 20 years of operation by controlling the number and location of platforms and the number of injectors and producers, as well as their trajectories and drilling sequence. The large number of control parameters and the subsurface uncertainties make the optimization process challenging. Three optimization techniques, genetic algorithm (GA), particle swarm optimization (PSO), and ensemble-based optimization (EnOpt), were tested and their performances compared. The best result in terms of NPV improvement was obtained with the mixed-integer genetic algorithm. More than ten thousand simulation runs were required for the method to reach an optimal development in terms of well locations, trajectories, drilling sequence, etc. This was made possible by utilizing a high-performance parallel simulator and cloud computing. The estimated cost of the commercial cloud service is almost negligible compared with the improvement in the economic value of the optimized asset development plan. The developed workflow and parameterization technique are flexible in well trajectory configuration and completion design, allowing application to primary depletion as well as waterflooding.
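A sketch of the kind of NPV objective such a workflow would maximize is given below (Python; the prices, costs, discount rate, and production profiles are invented, and robustness across subsurface uncertainty is represented simply by averaging NPV over an ensemble of realizations).

```python
import numpy as np

def npv(oil_rate, water_inj_rate, capex, oil_price=60.0, inj_cost=5.0,
        discount_rate=0.08):
    """Discounted NPV in USD for yearly production profiles (bbl/year).
    All economic figures are illustrative, not the benchmark's economics."""
    years = np.arange(1, len(oil_rate) + 1)
    discount = (1.0 + discount_rate) ** -years
    cashflow = oil_price * oil_rate - inj_cost * water_inj_rate
    return float(np.sum(cashflow * discount)) - capex

def robust_objective(development_plan, realizations):
    """Expected NPV over the ensemble of subsurface realizations; each
    realization maps a plan to its simulated profiles (placeholder callables
    standing in for full reservoir simulation runs)."""
    return np.mean([npv(*realization(development_plan)) for realization in realizations])

# Hypothetical usage with two toy 'realizations' returning 20-year profiles.
def make_realization(scale):
    def run(plan):
        oil = scale * plan["wells"] * 1.0e5 * np.exp(-0.15 * np.arange(20))
        water = 0.5 * oil
        return oil, water, plan["wells"] * 5.0e7
    return run

plan = {"wells": 8}
print(robust_objective(plan, [make_realization(1.0), make_realization(0.8)]))
```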