The petro-elastic model (PEM) is an integral component in the closed-loop calibration of integrated four-dimensional (4D) solutions incorporating time-lapse seismic, elastic and petrophysical rock property modeling, and reservoir simulation. Calibration of the reservoir simulation model is needed so that it is consistent not only with production history but also with the contemporaneous subsurface description as characterized by time-lapse seismic. The PEM requires dry rock properties in its description, which are typically derived from mechanical rock tests. In the absence of those mechanical tests, a small-data challenge is posed, whereby not all necessary data are available but the value of reconciling seismic attributes with simulated production remains. A seismic inversion-constrained n-dimensional metaheuristic optimization technique is employed directly on three-dimensional (3D) geocellular arrays to determine elastic and density properties for the PEM embedded in the commercial reservoir simulator.
Ill-posed dry elastic and density property models are considered in a field case where a seismic inversion, and a petrophysical property model constrained by that inversion, already exist. An n-dimensional design optimization technique is implemented to determine the optimal solution of a multidimensional pseudo-objective function built from multidimensional design variables. This study investigates a modified particle swarm optimization (PSO) method combined with an exterior penalty function (EPF) under varied constraints, given the limited availability of constraint data. Heavily penalized and reduced-order penalized metaheuristic optimization processes, in which the design variables and optimal solution are derived from 3D arrays, are examined so that constraint applicability can be quantified. While the process is examined specifically for the PEM, it can be applied to other data-limited modeling techniques.
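As a hedged illustration of the PSO-plus-EPF combination described above (not the authors' implementation, which operates on 3D geocellular arrays), the following sketch minimizes a toy penalized pseudo-objective. The objective, constraint, swarm size, and tuning constants are all assumptions chosen for clarity:

```python
import random

def pso_epf(f, g_list, dim, bounds, n_particles=30, iters=200, r_p=100.0, seed=0):
    """Minimize f subject to g(x) <= 0 via PSO with an exterior penalty function."""
    rng = random.Random(seed)
    lo, hi = bounds

    def penalized(x):
        # Exterior penalty: only violated constraints (g > 0) contribute.
        return f(x) + r_p * sum(max(0.0, g(x)) ** 2 for g in g_list)

    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                      # personal bests
    p_val = [penalized(x) for x in X]
    g_best = P[min(range(n_particles), key=lambda i: p_val[i])][:]
    g_val = min(p_val)

    w, c1, c2 = 0.7, 1.5, 1.5                  # inertia and acceleration weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (g_best[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            val = penalized(X[i])
            if val < p_val[i]:
                P[i], p_val[i] = X[i][:], val
                if val < g_val:
                    g_best, g_val = X[i][:], val
    return g_best, g_val

# Toy problem: minimize a sum of squares subject to x[0] >= 1.
best, val = pso_epf(lambda x: sum(v * v for v in x),
                    [lambda x: 1.0 - x[0]], dim=3, bounds=(-5.0, 5.0))
```

With a finite penalty weight the constrained optimum sits just inside the penalty region (x[0] slightly below 1), which is characteristic of exterior penalty methods.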
Integration of time-lapse seismic data into the dynamic reservoir model is an efficient way to calibrate reservoir parameter updates. The choice of metric used to measure the misfit between observed data and the simulated model has a considerable effect on the history matching process, and hence on the optimal ensemble of models obtained. History matching with 4D seismic and production data simultaneously remains a challenge because of the different natures of the two data types (time series versus maps or volumes).
Conventionally, the misfit is formulated as a least-squares measure, which is widely used for production data matching. Distance-based objective functions designed for 4D image comparison have been explored in recent years and have proven reliable. This study explores the history matching process by introducing a merged objective function combining the production and 4D seismic data. The proposed approach makes these two data types (well and seismic) comparable within a single objective function to be optimised, thereby avoiding the question of weights. An adaptive evolutionary optimisation algorithm is used for the history matching loop. Local and global reservoir parameters are perturbed in this process, including porosity, permeability, net-to-gross, and fault transmissibility.
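One simple way to merge the two data types without explicit weights, sketched here as an illustrative assumption rather than the paper's actual distance-based image metric, is to normalize each misfit by the norm of its own observed data so that both terms become dimensionless:

```python
import math

def merged_misfit(prod_obs, prod_sim, seis_obs, seis_sim):
    """Combine production (time-series) and 4D-seismic (map) misfits into one
    dimensionless objective: each term is a root-mean-square error normalized
    by the norm of the observed data, so no ad-hoc weights are needed."""
    def nrmse(obs, sim):
        num = math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)))
        den = math.sqrt(sum(o ** 2 for o in obs)) or 1.0
        return num / den

    # Flatten the 2-D seismic map so both data types reduce to one scalar each.
    seis_o = [v for row in seis_obs for v in row]
    seis_s = [v for row in seis_sim for v in row]
    return nrmse(prod_obs, prod_sim) + nrmse(seis_o, seis_s)

# Perfect seismic match but doubled production rates -> normalized misfit of 1.
m = merged_misfit([1.0, 2.0], [2.0, 4.0], [[1.0, 1.0]], [[1.0, 1.0]])
```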
This combined production and seismic history matching has been applied to a UKCS field; it shows that acceptable production data matching is achieved while honouring the saturation information obtained from the 4D seismic surveys.
Time-lapse seismic monitoring is a powerful technique for reservoir management and the optimization of hydrocarbon recovery. In time-lapse seismic datasets, the difference in seismic properties across vintages enables the detection of production-induced spatio-temporal changes in saturated properties and structure. The main objectives are (1) to identify bypassed pay zones in time-lapse seismic data for the deepwater Amberjack field, located in the Gulf of Mexico, (2) to confirm the identified bypassed pay zones against reservoir simulation results, and (3) to recommend well planning strategies to exploit these bypassed resources.
A high-fidelity seismic-to-simulation 4D workflow incorporating seismic, petrophysics, petrophysical property modeling, and reservoir simulation was employed, leveraging cross-discipline interaction, interpretation, and integration to extend asset management capabilities. The workflow addresses geology (well log interpretation and framework development), geophysics (seismic interpretation, velocity modeling, and seismic inversion), and petrophysical property modeling (earth models and co-located co-simulation of petrophysical properties with P-impedance from seismic inversion). An embedded petro-elastic model (PEM) in the reservoir simulator is then used to associate spatial dry rock properties with saturation properties to compute dynamic elastic properties, which can be related to multi-vintage P-impedance from time-lapse seismic inversion. In the absence of the requisite dry rock properties for the PEM, a small data engine is used to determine these absent properties using metaheuristic optimization techniques. Specifically, two particle swarm optimization (PSO) applications, each including an exterior penalty function (EPF), are modified, resulting in the development of nested and average methods, respectively. These methods simultaneously calculate the missing rock parameters (dry rock bulk modulus, shear modulus, and density) necessary for dynamic, embedded P-impedance calculation in the history-constrained reservoir simulation results. Afterward, a graphic-enabled method was devised to determine the threshold that discriminates non-reservoir (including bypassed pay) from reservoir in the P-impedance difference. Its results are compared to unsupervised learning (k-means clustering and hierarchical clustering). From the seismic data, bypassed pay locations can be identified, which are confirmed by reservoir simulation after conducting a seismic-driven history match. Finally, infill wells are planned and then modeled in the reservoir simulator.
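The k-means route to a P-impedance-difference threshold can be sketched as follows; this 1-D, two-cluster version and the synthetic impedance values are illustrative assumptions, not the field workflow:

```python
def kmeans_1d_threshold(values, iters=50):
    """Two-cluster 1-D k-means on P-impedance differences; the midpoint of the
    two converged centroids serves as the reservoir/non-reservoir threshold."""
    c_lo, c_hi = min(values), max(values)
    for _ in range(iters):
        # Assign each value to its nearest centroid.
        lo_pts = [v for v in values if abs(v - c_lo) <= abs(v - c_hi)]
        hi_pts = [v for v in values if abs(v - c_lo) > abs(v - c_hi)]
        new_lo = sum(lo_pts) / len(lo_pts) if lo_pts else c_lo
        new_hi = sum(hi_pts) / len(hi_pts) if hi_pts else c_hi
        if new_lo == c_lo and new_hi == c_hi:
            break
        c_lo, c_hi = new_lo, new_hi
    return (c_lo + c_hi) / 2.0

# Synthetic impedance differences: one soft (reservoir) mode, one stiff mode.
vals = [0.1, 0.2, 0.15, 0.9, 1.0, 0.95]
thr = kmeans_1d_threshold(vals)
```

For this bimodal toy data the centroids converge to 0.15 and 0.95, giving a threshold of 0.55 between the two modes.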
Probabilistic and deterministic methods for reserves and resources evaluation are commonly used in isolation and are often considered mutually exclusive. Subsurface uncertainties are critical factors impacting projects and reserves/resources, especially in projects and areas where large sums of capital investment are required. Probabilistic methods allow rigorous use of information on the ranges of uncertainty in key reservoir parameters, such as porosity, water saturation, permeability, and aquifer size, for reserves estimation. A key output of probabilistic methods is the confidence level associated with the reserves. Deterministic methods cannot provide confidence levels for reserves and resources assessments, which is why their successful application often relies on the expert knowledge of the evaluator and the strict use of reserves or resources definitions. Technological advances in computing in recent decades have played a key role in advancing computationally intensive probabilistic methodologies, including artificial intelligence. These advances have allowed integrated teams to perform studies using sophisticated workflows in feasible timeframes.
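A minimal Monte Carlo sketch of the probabilistic idea, using the standard volumetric STOIIP formula with purely illustrative input distributions (all ranges below are assumptions, not any particular field), shows how P90/P50/P10 confidence levels emerge:

```python
import random

def probabilistic_stoiip(n=20000, seed=1):
    """Monte Carlo volumetric estimate (illustrative distributions only).
    STOIIP [MMstb] = 7758 * A[acres] * h[ft] * phi * (1 - Sw) / Bo / 1e6."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        area = rng.uniform(800.0, 1200.0)     # drainage area, acres
        h = rng.uniform(40.0, 80.0)           # net pay, ft
        phi = rng.gauss(0.22, 0.02)           # porosity, fraction
        sw = rng.gauss(0.30, 0.04)            # water saturation, fraction
        bo = rng.uniform(1.1, 1.3)            # oil formation volume factor
        samples.append(7758.0 * area * h * phi * (1.0 - sw) / bo / 1e6)
    samples.sort()
    # P90/P50/P10: volumes exceeded with 90 / 50 / 10 % probability.
    return samples[int(0.10 * n)], samples[int(0.50 * n)], samples[int(0.90 * n)]

p90, p50, p10 = probabilistic_stoiip()
```

The spread between P90 and P10 is the confidence-level information that a single deterministic estimate cannot convey.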
Chemical enhanced oil recovery (EOR) methods have received increased attention in recent years because of their ability to recover capillary-trapped oil. Successful chemical flooding application requires accurate numerical models and reliable forecasts across multiple scales: core scale, pilot scale, and field scale. History matching and optimization are two key steps to achieve this goal.
For history matching chemical floods, we propose a general workflow for multi-stage model calibration using an evolutionary algorithm. A comprehensive chemical flooding simulator is used to model important physical mechanisms including phase behavior, cation exchange, chemical and polymer adsorption, and capillary desaturation. First, we identify dominant reservoir and process parameters based on a sensitivity analysis. The history matching is then carried out in a stage-wise manner whereby the most dominant parameters are calibrated first and additional parameters are incorporated sequentially until a satisfactory data misfit is achieved. Next, a diverse subset of history-matched models is selected for optimization using a Pareto-based multi-objective optimization approach. Based on the concept of dominance, Pareto optimal solutions are generated representing the trade-off between increasing oil recovery and improving the efficiency of chemical usage. These solutions are searched using a Non-dominated Sorting Genetic Algorithm (NSGA-II). Finally, we implement a History Matching Quality Index (HMQI) with moving linear regression analysis to evaluate simulation results from the history matching process. The HMQI provides normalized values for objective functions of different magnitudes and leads to a more consistent and robust approach to evaluating the updated models through model calibration.
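The Pareto-dominance concept underlying NSGA-II can be sketched as follows; the candidate objective values are illustrative assumptions, with oil recovery negated so that both objectives are minimized:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly better
    in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset: the first front in NSGA-II's sorting."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Objectives per candidate: (-oil recovery fraction, chemical usage).
pts = [(-0.50, 3.0), (-0.60, 4.0), (-0.40, 2.0), (-0.55, 5.0), (-0.45, 3.5)]
front = pareto_front(pts)
```

Here (-0.55, 5.0) and (-0.45, 3.5) are dominated (another candidate recovers at least as much oil with less chemical), leaving a three-point trade-off front.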
Multistage hydraulic fracturing of a horizontal well in an unconventional reservoir tends to induce a complex fracture network (CFN) which is challenging to characterize by conventional methods. In this work, we develop a fracture characterization workflow to estimate the geometric configuration and fracture properties of a CFN by assimilating microseismic event data and production data, sequentially.
A novel stochastic fractal model, consistent with rock physics and outcrop observations, is developed to generate realizations of the complex fracture network. In the first stage of the two-stage assisted history matching workflow, we estimate the parameters of the stochastic fractal model (fracture intensity, average fracture length, orientation, and fracture distribution) by using a genetic algorithm to history match data for the locations of microseismic events. In the second stage, the production data from the shale reservoir are assimilated by the ES-MDA algorithm to estimate the stimulated reservoir volume (SRV) and its average permeability, fracture permeability, aperture, and porosity. In the unconventional shale gas reservoir simulator used as the forward model, large-scale fractures are modeled via the embedded discrete fracture model (EDFM), and a dual-porosity, dual-permeability (DP-DK) model is used for modeling the SRV and small-scale fractures. The simulator includes Knudsen diffusion and the Langmuir adsorption/desorption model.
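A scalar-parameter sketch of the ES-MDA update, with a toy linear forward model standing in for the reservoir simulator, illustrates the scheme's key feature: the observation-error variance is inflated by a factor alpha in each assimilation step, with the reciprocals of the alphas summing to one. Everything below (ensemble size, forward model, noise levels) is an assumption for illustration:

```python
import random

def es_mda(forward, m_ens, d_obs, sd_obs, n_a=4, seed=2):
    """Minimal ES-MDA for a scalar parameter and a scalar datum: n_a data
    assimilations, each with inflation alpha = n_a (so sum(1/alpha) = 1)."""
    rng = random.Random(seed)
    for _ in range(n_a):
        alpha = float(n_a)
        d_ens = [forward(m) for m in m_ens]
        m_bar = sum(m_ens) / len(m_ens)
        d_bar = sum(d_ens) / len(d_ens)
        c_md = sum((m - m_bar) * (d - d_bar)
                   for m, d in zip(m_ens, d_ens)) / (len(m_ens) - 1)
        c_dd = sum((d - d_bar) ** 2 for d in d_ens) / (len(m_ens) - 1)
        k = c_md / (c_dd + alpha * sd_obs ** 2)   # Kalman-type gain
        m_ens = [m + k * (d_obs + (alpha ** 0.5) * sd_obs * rng.gauss(0, 1) - d)
                 for m, d in zip(m_ens, d_ens)]
    return m_ens

# Toy: linear forward model d = 2m, truth m = 1.5, so d_obs = 3.
prior_rng = random.Random(0)
prior = [prior_rng.gauss(0.0, 1.0) for _ in range(200)]
posterior = es_mda(lambda m: 2.0 * m, prior, d_obs=3.0, sd_obs=0.1)
post_mean = sum(posterior) / len(posterior)
```

For this linear-Gaussian toy the ensemble mean should approach the analytic posterior mean of about 1.5.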
For validation, we consider a synthetic shale gas reservoir with a horizontal well that has been stimulated by multistage hydraulic fracturing. A particular realization of the variables that describe the reservoir model is used to generate observed data for microseismic events and production rates. The parameters to be adjusted to match the observed microseismic events are the expected values of the length, orientation, and intensity of the distribution of the natural fractures and the fractal pattern. Results show that we obtain good estimates of the expected values of natural fracture length, orientation, intensity, and fracture distribution by history matching observations of locations of microseismic events. These estimates provide an updated stochastic fractal model for the configuration of the CFN. The history-matched fractal model is used to generate an ensemble of fracture distributions consistent with microseismic data as candidate fracture configurations when estimating fracture properties by matching production data. We obtain much better history matches, future performance predictions, estimates of stimulated reservoir volume and its average permeability, and estimates of fracture permeability, porosity, and aperture when we match both microseismic and production data than when we match only production data. When both seismic and production data are matched for synthetic cases and parameters are properly scaled, the true values of parameters and reservoir performance predictions are within the P25-P75 confidence intervals calculated from the ensemble of history-matched models in virtually all cases.
In practice, the proper characterization of the CFN and reservoir properties should be useful for placing new wells and designing fracture treatments.
Geophysical Reservoir Monitoring (GRM) systems such as 4D seismic are increasingly used in the oil and gas industry because they provide unique and useful information on fluid movement within the reservoir. This information is relevant for many reservoir management decisions, including new well placement, well intervention, and reservoir model updating.
Unfortunately, it has been difficult to estimate the value creation of any data acquisition scheme because a multidisciplinary approach is required to model the value that future measurements will bring to future decisions. This assessment requires a common decision-making simulation framework that can integrate the input from geo-modelers, geophysicists, and reservoir engineers.
This work presents an example of how a Closed-Loop Reservoir Management (CLRM) simplification can be used as a framework for simulating NPV changes due to the assimilation of production and saturation data in a simple toy model. It combines state-of-the-art data assimilation and uncertainty modeling methods with a robust-optimization genetic algorithm to calculate NPV improvements due to the model update and their relationship with the NPV obtained from the synthetic reservoir.
In this context, a simple synthetic model is presented. It recreates a segment of a green field under strong aquifer influence with two discovery wells. The reservoir development requires the selection of four well locations at fixed drilling times. The development strategy is selected using a genetic algorithm within the CLRM framework. Subsequently, two cases are presented: one assimilating only production data after the first two wells have been drilled, just before deciding the locations of the last two wells; and a second in which production and saturation data are assimilated at the same time. The assimilated saturation map is assumed to be the output of a 4D seismic acquisition. The model update imposes the need to optimally relocate the last two wells, which results in an NPV change.
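A toy genetic algorithm for the well-location selection step might look like the following; the value map (a stand-in for an NPV response surface), the encoding, and the operators are assumptions, not the CLRM study's setup:

```python
import random

def ga_well_placement(value_map, n_wells=4, pop=40, gens=60, seed=3):
    """Toy GA: pick n_wells cells of a 2-D value map maximizing total value."""
    rng = random.Random(seed)
    nx, ny = len(value_map), len(value_map[0])
    cells = [(i, j) for i in range(nx) for j in range(ny)]

    def fitness(ind):
        # Duplicate cells count once: drilling the same cell twice adds nothing.
        return sum(value_map[i][j] for i, j in set(ind))

    P = [rng.sample(cells, n_wells) for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=fitness, reverse=True)
        elite = P[: pop // 2]                    # keep the better half
        children = []
        while len(children) < pop - len(elite):
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_wells)
            child = a[:cut] + b[cut:]            # one-point crossover
            if rng.random() < 0.3:               # mutation: move one well
                child[rng.randrange(n_wells)] = rng.choice(cells)
            children.append(child)
        P = elite + children
    best = max(P, key=fitness)
    return best, fitness(best)

# Value map with four bright spots at the corners (ideal total = 40).
vmap = [[0.0] * 5 for _ in range(5)]
for (i, j) in [(0, 0), (0, 4), (4, 0), (4, 4)]:
    vmap[i][j] = 10.0
wells, total = ga_well_placement(vmap)
```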
The results show that the obtained NPV is increased by the relocation of the last two wells in both cases. A bigger increment is obtained when both production and saturation data are assimilated. In addition, the ensemble improves its forecast capability the most when saturation assimilation is included. Nevertheless, the ensemble's expected NPV decreases after assimilation relative to the value obtained from the first development strategy optimization; this indicates an optimistic early NPV valuation due to the spread of the initial ensemble distributions.
The study presents an asset simulation framework that could be used to evaluate data acquisition investments through the systematic modeling of reservoir uncertainties within a decision-oriented focus. Extensions could include additional uncertain model parameters, the insertion of water injectors and well conversions, the assimilation of saturations at different intervals, and changes in the quality of the assimilated saturation maps, in addition to sensitivity studies of other economic constraints.
Coupled reservoir flow and geomechanics has numerous important applications in the oil and gas industry, such as land subsidence, hydraulic fracturing, fault reactivation, and hydrocarbon recovery. High-fidelity numerical schemes and multiphysics models must be coupled in order to simulate these processes and their interactions accurately and efficiently. Specifically, in CO2 sequestration applications, the effect of geomechanics on carbon storage estimation is not negligible. However, coupled flow-geomechanics simulations are very computationally expensive, and most of the computational time is usually spent on the geomechanics calculations. This paper investigates a three-way coupling algorithm that uses an error indicator to determine when displacement must be updated and whether the fixed-stress iterative coupling technique is required. Numerical experiments with coupled nonlinear single-phase flow and linear poromechanics show that the three-way coupling algorithm achieves a fourfold speed-up compared to the fixed-stress iterative coupling algorithm. Extensions to coupled compositional flow with poromechanics also show a fivefold speed-up for continuous CO2 sequestration applications and a twofold speed-up for surfactant-alternating-gas (SAG) applications. The substantial speed-up makes the three-way coupling of flow and geomechanics feasible in large-scale optimizations. Based on the three-way coupling of compositional flow and geomechanics, we experimented with two black-box optimization algorithms, the covariance matrix adaptation evolution strategy (CMA-ES) and a genetic algorithm (GA), for optimizing well controls during the SAG process to maximize CO2 storage volume. CMA-ES outperforms GA in that it is more robust and achieves a higher objective function value in fewer simulation runs. The optimized SAG process achieves 27.55% more CO2 storage volume and reduces water and surfactant consumption by 54.84%.
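As a hedged, minimal stand-in for the black-box optimizers compared above, a (1+1) evolution strategy with the 1/5th success rule can be sketched on a toy well-control objective. This is far simpler than full CMA-ES, which additionally adapts a covariance matrix; the target rates and all constants below are assumptions:

```python
import random

def one_plus_one_es(f, x0, sigma=1.0, iters=400, seed=4):
    """(1+1) evolution strategy with 1/5th-success-rule step-size control:
    expand sigma after an improving mutation, shrink it otherwise."""
    rng = random.Random(seed)
    x, fx = x0[:], f(x0)
    for _ in range(iters):
        y = [xi + sigma * rng.gauss(0, 1) for xi in x]
        fy = f(y)
        if fy < fx:
            x, fx = y, fy
            sigma *= 1.22        # success: take larger steps
        else:
            sigma *= 0.95        # failure: shrink (balances at ~1/5 successes)
    return x, fx

# Toy well-control problem: quadratic penalty around target control settings.
target = [0.6, 0.3, 0.8]
x_best, f_best = one_plus_one_es(
    lambda x: sum((xi - ti) ** 2 for xi, ti in zip(x, target)),
    [0.0, 0.0, 0.0])
```

The chosen expansion and shrink factors (1.22 and 0.95) balance at roughly a one-fifth success rate, which is the classical heuristic for this scheme.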
Motie, Mohadeseh (Tarh O Palayesh Engineering Co.) | Moein, Peyman (Tarh O Palayesh Engineering Co.) | Moghadasi, Ramin (Department of Earth Sciences, Uppsala University) | Hadipour, Ali (Tarh O Palayesh Engineering Co.)
Once reservoir fluid is brought to the surface, it is separated into discrete oil and gas streams in the production unit. This should be conducted with a view to maximizing oil production while decreasing total costs. The regulation of separator pressures is key to achieving the highest oil recovery in this process. In addition, estimating the optimum number of separation stages improves the economics of the production plant design. Adjusting separator pressure is usually conducted using PVT data provided by flash or equilibrium separation tests. When experimental tests are unavailable, an alternative is empirical correlations; however, such methods are inaccurate and time consuming, which may limit their utility. In this study, the issue was addressed by applying a genetic algorithm (GA) together with flash calculations in a case study on an Iranian oil field. The set-point pressures of each stage were optimized to enhance the liquid share of the fluid while minimizing the gas-oil ratio (GOR). Prior to this, four scenarios were considered for determining the number of separation stages. To determine the best scenario, a cost evaluation was implemented, which led to the selection of the four-separator scenario as the most economical case.
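The flash calculation at the heart of separator optimization can be sketched with a Rachford-Rice solve using Wilson-correlation K-values; the three-component feed and the separator conditions below are illustrative assumptions (rough literature values for C1, n-C4, and C10), not the field's fluid:

```python
import math

def wilson_k(p, t, pc, tc, omega):
    """Wilson correlation for equilibrium K-values (p, pc in bar; t, tc in K)."""
    return [(pci / p) * math.exp(5.373 * (1.0 + wi) * (1.0 - tci / t))
            for pci, tci, wi in zip(pc, tc, omega)]

def rachford_rice(z, k):
    """Solve sum z_i (k_i - 1) / (1 + V (k_i - 1)) = 0 for the vapor fraction
    V by bisection; the function is monotonically decreasing in V."""
    def h(v):
        return sum(zi * (ki - 1.0) / (1.0 + v * (ki - 1.0))
                   for zi, ki in zip(z, k))
    lo, hi = 1e-12, 1.0 - 1e-12
    if h(lo) < 0.0:
        return 0.0   # all liquid at these conditions
    if h(hi) > 0.0:
        return 1.0   # all vapor
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if h(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative 3-component feed (C1, n-C4, C10) flashed at 10 bar, 350 K.
z = [0.5, 0.3, 0.2]
pc = [46.0, 38.0, 21.1]          # critical pressures, bar
tc = [190.6, 425.1, 617.7]       # critical temperatures, K
omega = [0.011, 0.199, 0.489]    # acentric factors
k = wilson_k(p=10.0, t=350.0, pc=pc, tc=tc, omega=omega)
v_frac = rachford_rice(z, k)
```

An optimizer such as a GA would wrap this flash, varying the stage pressures and scoring each candidate by the resulting liquid yield and GOR.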
Sailboat routing consists of computing the best route for a sailboat, taking into account the characteristics of the ship and environmental data such as weather forecasts. In the context of sailing races, the best-route computation is usually based on the isochrone algorithm, a sub-optimal method for minimizing the time to destination (TtD) by computing a route as a sequence of waypoints. In this paper, we propose to compute a set of possible routes by considering two criteria: the time to destination and the stress. The time to destination is evaluated according to weather forecasts and boat polar diagrams. The stress function is a combination of human and environmental factors. The set of possible routes is then obtained using an iterative multiple-objective optimization algorithm. The isochrone algorithm is used to initialize the set of routes; mutation operators are then used to explore alternative solutions. Applied to realistic test cases, our search strategy obtains routes with very different characteristics in terms of time-to-destination and stress values, as confirmed by experienced sailors. Concerning the main objective of minimizing time to destination, we are competitive with commercial software such as MaxSea or Adrena.
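A toy isochrone expansion step, assuming a constant boat speed in place of real polar diagrams and weather fields, can be sketched as follows; the heading and sector counts are illustrative assumptions:

```python
import math

def isochrone_step(points, speed, dt, n_headings=36, n_sectors=36):
    """One isochrone expansion: from every reachable point, sail dt hours on
    each candidate heading, then keep only the farthest candidate (measured
    from the start at the origin) within each bearing sector."""
    candidates = []
    for (x, y) in points:
        for h in range(n_headings):
            a = 2.0 * math.pi * h / n_headings
            candidates.append((x + speed * dt * math.cos(a),
                               y + speed * dt * math.sin(a)))
    best = {}
    for (x, y) in candidates:
        bearing = math.atan2(y, x) % (2.0 * math.pi)
        sector = min(n_sectors - 1, int(bearing / (2.0 * math.pi) * n_sectors))
        r = math.hypot(x, y)
        if sector not in best or r > best[sector][0]:
            best[sector] = (r, (x, y))
    return [p for _, p in best.values()]

# Two one-hour steps from the origin with a constant 6-knot boat speed.
front = isochrone_step([(0.0, 0.0)], speed=6.0, dt=1.0)
front = isochrone_step(front, speed=6.0, dt=1.0)
max_r = max(math.hypot(x, y) for x, y in front)
```

With a uniform speed the front after two steps reaches exactly 12 nautical miles; with real polars and wind fields the pruning per bearing sector is what keeps the candidate set tractable.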