In this work we discuss the successful application of our previously developed automated scenario-reduction approach to the life-cycle optimization of a real field case. The inherent uncertainty in the description of reservoir properties motivates the use of an ensemble of model scenarios to achieve an optimized, robust reservoir development strategy. To accurately span the range of uncertainties, it is imperative to build a relatively large ensemble of model scenarios. The computational effort required in robust optimization grows directly with the size of the ensemble. For high-dimensional, complex field-case models, this implies that a large ensemble of model scenarios, although it accurately captures the inherent uncertainties, would be computationally infeasible to use for robust optimization. One way to circumvent this problem is to work with a reduced subset of model scenarios. Methods based on heuristics and ad-hoc rules exist to select this reduced subset; however, in most cases, the optimal number of model realizations must be known upfront. An excessively small number of realizations may result in a subset that does not capture the span of uncertainties present, leading to sub-optimal optimization results. This raises the question of how to effectively select a subset containing an optimal number of realizations that both captures the uncertainties present and allows for computationally efficient robust optimization. To answer this question, we have developed an automated framework to select the reduced ensemble, which has been applied to an original ensemble of 300 equiprobable model scenarios of a real field case. The methodology relies on the fact that, ideally, the distance between the cumulative distribution functions (CDF) of the objective function (OF) of the full and reduced ensembles should be minimal. This allows the method to determine, based on a statistical metric, the smallest subset of realizations that both spans the range of uncertainties and provides an OF CDF representative of the full ensemble. In this real field case application, we optimize the injection rates throughout the asset's life-cycle with expected cumulative oil production as the OF. The newly developed framework selected a small subset of 17 model scenarios out of the original ensemble, which was used for robust optimization. The optimal injection strategy achieved an average increase of 6% in cumulative oil production with a significant reduction, approximately 90%, in the computational effort. Validation of this optimal strategy over the original ensemble led to very similar improvements in cumulative oil production, highlighting the reliability and accuracy of our framework.
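To make the CDF-matching idea concrete, the following is a minimal Python sketch, assuming the OF value (e.g., cumulative oil production) has already been simulated for every scenario; the greedy growth rule, the Kolmogorov-Smirnov distance, and the tolerance are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

def ks_distance(sample, full):
    """Kolmogorov-Smirnov distance between the empirical CDFs
    of a candidate subset and the full ensemble."""
    grid = np.sort(full)
    cdf_sample = np.searchsorted(np.sort(sample), grid, side="right") / len(sample)
    cdf_full = np.arange(1, len(grid) + 1) / len(grid)
    return np.max(np.abs(cdf_sample - cdf_full))

def reduce_ensemble(of_values, tol=0.05):
    """Greedily grow a subset of scenario indices until the OF CDF
    of the subset is within `tol` of the full-ensemble CDF."""
    remaining = list(range(len(of_values)))
    subset = []
    while remaining:
        # pick the scenario whose addition most reduces the CDF distance
        best = min(remaining,
                   key=lambda i: ks_distance(of_values[subset + [i]], of_values))
        subset.append(best)
        remaining.remove(best)
        if ks_distance(of_values[subset], of_values) < tol:
            break
    return subset

of_values = np.random.lognormal(mean=3.0, sigma=0.4, size=300)  # placeholder OF values
print(reduce_ensemble(of_values))
```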
There are a vast number of reservoirs with drill-cuttings and core images that have classification problems associated with them. This could be because the images were never classified in the first place, or because the images are available but the interpretation reports are missing. Another problem is that images from different wells may have been interpreted by different wellsite geologists/sedimentologists, resulting in an inconsistent classification scheme. Finally, some images may simply be incorrectly classified. It would therefore be desirable to have an unbiased, objective system that could overcome all of these issues. Convolutional neural networks offer exactly that. Advances during this decade in using convolutional neural networks for visual recognition of visually distinct objects mean that object recognition can now be achieved to a significant extent. Once the network is trained on a representative set of lithological classes, such a system simply needs to be fed raw drill-cuttings or core images it has not seen before, and it will automatically assign a lithological class to each image together with the probability of the image belonging to that class. Images below a certain probability threshold can then be automatically flagged for further human investigation. The benefit of such a system would be to improve reservoir understanding by having all available images classified in a consistent manner, keeping the characterization consistent as well. It would further reduce the time taken for human experts to complete the task, as well as the associated cost.
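As an illustration of the classify-and-flag step described above, here is a minimal PyTorch sketch, assuming a CNN has already been fine-tuned on labelled lithology images; the ResNet-18 backbone, the class list, and the probability threshold are all hypothetical choices, not those of the original system:

```python
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

LITHOLOGY_CLASSES = ["sandstone", "shale", "limestone", "dolomite"]  # illustrative classes
CONFIDENCE_THRESHOLD = 0.8  # images below this are flagged for human review

# A ResNet backbone with its final layer resized to the lithology classes;
# in practice it would first be fine-tuned on labelled cuttings/core images.
model = models.resnet18(weights=None)
model.fc = torch.nn.Linear(model.fc.in_features, len(LITHOLOGY_CLASSES))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def classify_image(path):
    """Assign a lithology class, its probability, and a review flag."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = F.softmax(model(x), dim=1).squeeze(0)
    conf, idx = probs.max(dim=0)
    label = LITHOLOGY_CLASSES[idx.item()]
    flagged = conf.item() < CONFIDENCE_THRESHOLD
    return label, conf.item(), flagged
```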
Al-Farisi, Omar (Khalifa University of Science and Technology) | Zhang, Hongtao (Khalifa University of Science and Technology) | Raza, Aikifa (Khalifa University of Science and Technology) | Ozzane, Djamel (ADNOC) | Sassi, Mohamed (Khalifa University of Science and Technology) | Zhang, TieJun (Khalifa University of Science and Technology)
Automated image-processing algorithms can improve the quality and speed of classifying the morphology of heterogeneous carbonate rock. Several commercial products derive petrophysical properties from 2D images and, to a lesser extent, from 3D images, relying on image processing and flow simulation. The images are mainly micro-computed tomography (μCT), optical thin-section images, or magnetic resonance images (MRI). However, most of the successful work has been on homogeneous and clastic rocks. In this work, we demonstrate a Machine Learning assisted Image Recognition (MLIR) approach to determine the porosity and lithology of heterogeneous carbonate rock by analyzing 3D images from μCT and MRI. Our research method consists of two parts: experimental and MLIR. Experimentally, we measured the porosity of a rock core plug in three different ways: (i) the weight difference between dry and saturated rock, (ii) NMR T2 relaxation of saturated rock, and (iii) helium gas injection after cleaning and drying the rock.
We performed MLIR on 3D μCT and MRI images using the random forest machine-learning algorithm. A petrophysicist provided a set of training data with classes (i.e., limestone, pyrite, and pore), encoding expert knowledge of how μCT image intensity corresponds to petrophysical properties. MLIR then performed each task of identifying the different lithology types and the porosity on its own. The determined volumes were checked and confirmed against three different experimental datasets. The porosities measured by the three experiment-based approaches are very close. Similarly, the MLIR-derived porosity compared excellently with the three experimental measurements, with an accuracy of 97.1% on the training set and 94.4% on blind-test prediction.
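The following is a minimal scikit-learn sketch of random-forest voxel classification of the kind described above, assuming per-voxel features and expert labels are available; the feature construction and intensity values are synthetic placeholders, not the study's actual training data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Per-voxel training features (here just intensity and a second intensity-
# derived feature) labelled by a petrophysicist as 0=limestone, 1=pyrite,
# 2=pore. The data below are synthetic placeholders for illustration.
rng = np.random.default_rng(0)
intensity = np.concatenate([rng.normal(m, 5.0, 500) for m in (120.0, 220.0, 30.0)])
features = np.column_stack([intensity, intensity + rng.normal(0.0, 2.0, 1500)])
labels = np.repeat([0, 1, 2], 500)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(features, labels)

def porosity_from_volume(clf, volume_features):
    """Classify every voxel of a 3D image and report the pore fraction."""
    pred = clf.predict(volume_features)
    return np.mean(pred == 2)  # fraction of voxels classified as pore
```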
Waterflooding is the main technique used to recover hydrocarbons from reservoirs. For a given set of wells (injectors and producers), the choice of injection/production parameters such as pressures, flow rates, and the locations of these boundary conditions has a significant impact on the operating life of the wells. As a large number of combinations of these parameters are possible, one of the critical decisions is to identify an optimal set of these parameters. Since using the reservoir simulator directly to evaluate each candidate set is unrealistic given the required number of simulations, a common approach is to use response surfaces to approximate the reservoir simulator outputs. Several techniques involving proxy models (e.g., kriging, polynomials, and artificial neural networks) have been suggested to replace the reservoir simulations. This paper focuses on the application of artificial neural networks (ANN), as it is commonly accepted that ANNs are the most efficient owing to their universal approximation capacity, i.e., the capacity to reproduce any continuous function. The paper presents a complete workflow to optimize well parameters under waterflooding using an artificial neural network as a proxy model. The proposed methodology allows different production configurations that maximize the NPV for a given level of risk to be evaluated. The optimized solutions can be analyzed with the efficient-frontier plot and Sharpe ratios. An application of the workflow to the Brugge field is presented to optimize the waterflooding strategy.
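A minimal sketch of the proxy idea, assuming control/NPV pairs from prior simulator runs are available (here replaced by a synthetic function); the network size and the risk metric shown are illustrative, not the paper's exact configuration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for simulator output: NPV as a function of 5 well-control
# parameters (rates/pressures). In practice these pairs come from a modest
# number of full reservoir simulations.
rng = np.random.default_rng(0)
controls = rng.uniform(0.0, 1.0, size=(200, 5))
npv = 100.0 * controls.sum(axis=1) - 30.0 * (controls ** 2).sum(axis=1)

proxy = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0)
proxy.fit(controls, npv)

def sharpe_ratio(npvs, risk_free=0.0):
    """Risk-adjusted score of a candidate strategy: mean NPV over the
    evaluated scenarios divided by its standard deviation."""
    npvs = np.asarray(npvs)
    return (npvs.mean() - risk_free) / npvs.std()

# Evaluate many candidate control settings through the cheap proxy instead of
# the simulator, then keep the best-performing one.
candidates = rng.uniform(0.0, 1.0, size=(10000, 5))
best = candidates[np.argmax(proxy.predict(candidates))]
```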
Al-Jenaibi, Faisal (ADNOC - Upstream) | Shelepov, Konstantin (Rock Flow Dynamics) | Kuzevanov, Maksim (Rock Flow Dynamics) | Gusarov, Evgenii (Rock Flow Dynamics) | Bogachev, Kirill (Rock Flow Dynamics)
The application of intelligent algorithms that use clever simplifications and methods to solve computationally complex problems is rapidly displacing traditional methods in the petroleum industry. The latest forward-thinking approaches in history matching and uncertainty quantification were applied to a dynamic model with an unknown permeability model. The original perm-poro profile was constructed from synthetic data so that the Assisted History Matching (AHM) approach could be compared to the exact solution. It is assumed that relative permeabilities, endpoints, and all parameters other than absolute permeability cannot be modified to match the oil/water/gas rates, gas-oil ratio, water injection rate, watercut, and bottomhole pressure.
The standard approach to matching a model via permeability variation is to split the grid into several regions. However, this process is largely guesswork, as it is unclear in advance how to select the regions; the geological prerequisites for such splitting usually do not exist. Moreover, the values of permeability and porosity in different grid blocks are correlated, and changing these values independently for each region distorts the correlations or makes the model unphysical.
The proposed alternative involves the decomposition of the permeability model into spectral amplitudes using the Discrete Cosine Transformation (DCT), a form of Fourier transform. The sum of all DCT basis functions weighted by their amplitudes reproduces the original property distribution. Constructing an uncertain permeability model typically involves subjective judgment and several optimization runs to build the uncertainty matrix. However, the proposed multi-objective Particle Swarm Optimization (PSO) helps to reduce randomness and find, with fewer runs, optimal solutions that are not dominated in any objective. Further tuning of the Flexi-PSO algorithm is performed on its constituent components, such as swarm size, inertia, nostalgia, sociality, damping factor, neighbor count, neighborliness, the proportion of explorers, egoism, community, and relative critical distance, to increase the speed of convergence. Additionally, clustering combined with Principal Component Analysis (PCA) is suggested as a means to reduce the dimensionality of the space of resulting solutions while ensuring the diversity of the selected cluster centers.
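A minimal sketch of the DCT parameterization, assuming a 2D permeability field for simplicity; the number of retained modes is an illustrative assumption:

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_parameterize(perm, n_modes=10):
    """Represent a 2D permeability field by its lowest-frequency DCT
    amplitudes; the inverse transform reconstructs a smooth approximation."""
    amps = dctn(perm, norm="ortho")
    truncated = np.zeros_like(amps)
    truncated[:n_modes, :n_modes] = amps[:n_modes, :n_modes]
    return truncated

def reconstruct(amps):
    return idctn(amps, norm="ortho")

# During history matching, only the retained amplitudes would be varied by
# the optimizer (e.g., PSO); correlations between neighbouring cells are
# preserved because each DCT mode is a smooth global basis function.
perm = np.random.lognormal(mean=4.0, sigma=1.0, size=(50, 50))
approx = reconstruct(dct_parameterize(perm, n_modes=8))
```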
The presented set of methods helps to achieve a qualitative and quantitative match with respect to any property, reduce the number of uncertainty parameters, and set up a generic and efficient approach towards assisted history matching.
With the advent of high-resolution methods for predicting hydraulic fracture geometry and subsequent production forecasting, characterizing the productive shale volume and evaluating completion-design economics through science-based forward modeling becomes possible. However, operationalizing a simulation-based workflow to optimize designs fast enough to keep up with the field operation schedule remains the biggest challenge, owing to the slow model-to-design turnaround cycle. The objective of this project is to apply the ensemble-learning-based model concept to this issue. For the purpose of completion design, we summarize the numerical-model-centric unconventional workflow as a process that ultimately models production from a well pad (of multiple horizontal laterals) as a function of completion design parameters. After development, validation, and analysis of the surrogate model are complete, the model can be used in predictive mode to answer the "what if" questions raised by the reservoir/completion management team.
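As a sketch of the surrogate-model concept, assuming design/production pairs from prior simulations (synthetic here) and a gradient-boosting regressor as an illustrative stand-in for the ensemble learner:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical training set: completion design parameters (e.g., cluster
# spacing, proppant loading, fluid volume, stage length) paired with
# simulated pad production; values here are synthetic placeholders.
rng = np.random.default_rng(1)
designs = rng.uniform(0.0, 1.0, size=(300, 4))
production = designs @ np.array([2.0, 1.5, 1.0, 0.5]) + rng.normal(0.0, 0.1, 300)

surrogate = GradientBoostingRegressor().fit(designs, production)

# Predictive "what if" mode: sweep one design parameter while holding the
# others fixed and inspect the predicted production response.
sweep = np.tile(designs.mean(axis=0), (50, 1))
sweep[:, 0] = np.linspace(0.0, 1.0, 50)
response = surrogate.predict(sweep)
```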
The traditional trial-and-error approach to history matching, in which engineers control each uncertain parameter to obtain an accurate model, can be quite time-consuming and inefficient. Automatic history matching (AHM), assisted by computers, is an efficient process in which an algorithm controls a large number of parameters simultaneously, integrating the static model with dynamic data to minimize a misfit and improve reliability. It helps to reduce simulation run time as well.
Particle Swarm Optimization (PSO) is a population-based stochastic algorithm that can explore the parameter space, here combined with a least-squares single-objective function. The AHM process can adopt parameterization and realization methods to constrain the inverse problem. In this study, realizations of various reservoir properties such as porosity, net-to-gross, relative permeability, horizontal and vertical permeability, and aquifer size were chosen as controls throughout the AHM. History matching was conducted to validate the efficiency of each method. Guidelines for optimizing AHM with a stochastic algorithm are also discussed.
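A minimal sketch of PSO driving a least-squares misfit, with a placeholder forward model standing in for the reservoir simulator; the swarm hyperparameters are illustrative, not those tuned in the study:

```python
import numpy as np

def pso_minimize(misfit, bounds, n_particles=30, n_iters=100,
                 inertia=0.7, cognitive=1.5, social=1.5):
    """Minimal particle swarm optimizer for a least-squares misfit over
    history-matching parameters (a sketch, not the study's implementation)."""
    rng = np.random.default_rng(0)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pos = rng.uniform(lo, hi, size=(n_particles, len(lo)))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([misfit(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)]
    for _ in range(n_iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = inertia * vel + cognitive * r1 * (pbest - pos) + social * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([misfit(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest, pbest_val.min()

# Example: squared error between observed rates and a toy forward model.
observed = np.array([1.0, 2.0, 3.0])
forward = lambda p: p[0] * np.array([1.0, 2.0, 3.0]) + p[1]
misfit = lambda p: np.sum((forward(p) - observed) ** 2)
bounds = np.array([[0.0, 2.0], [-1.0, 1.0]])
best, best_val = pso_minimize(misfit, bounds)
```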
The realization and parameterization methods improved matching results in a full-field application, producing a reduced misfit in less time. A stochastic algorithm generates multiple models to deduce the control parameters that reduce the misfit. In this study we found that PSO converged effectively with updated control parameters. The optimized AHM improved the accuracy of the full-field model, although some misfit remained in the match to bottomhole pressure.
We found that updating too many parameters makes the problem difficult to solve, while using too few leads to false convergence. In addition, because simulation run time is critical, a full-field simulation model with reduced computational overhead is beneficial.
In this study, we observed that PSO is an efficient algorithm for updating control parameters to reduce the misfit. Using parameterization and realization as assisting methods helped find better results. Overall, this study can serve as a guideline for optimizing the history-matching process.
While image processing is still an area of research, standard workflows have emerged and are routinely used in oil and gas companies.
However, while hardware capabilities have increased considerably, allowing large samples to be scanned with high fidelity, permeability simulations are still limited to small samples unless one has access to HPC. Direct simulations are known to be more flexible in terms of rock types but limited in terms of sample size, while pore-network-model-based simulations allow much larger sample sizes but fewer rock types.
In this study, we will focus on the pore-space analysis of a Middle Eastern carbonate sample. The rock sample is 7.5 cm tall and has a diameter of 3.8 cm.
It has been acquired at three different resolutions: a micro-CT scan at 16 μm, a micro-CT scan of a 10 mm diameter subsample at 5 μm, and an SEM section of 10 mm diameter at 2 μm.
This study will propose a methodology to mix the different scales in order to get an accurate pore space analysis of the largest possible sample size.
As microporous regions are visible at every scale, bringing uncertainty to the segmentation step, the first part of our analysis will consist of determining the most accurate pore space at the three different resolutions. We will rely on image registration (2D to 3D and 3D to 3D) and image-based upscaling methods, further validated by simulation results.
Given the large numerical size of the samples, specific workflows involving large data 3D visualization and processing will be presented.
Then, different measurements will be conducted: porosity and connected porosity, absolute permeability with three different methods (Lattice Boltzmann, Finite Volume, Pore Network Modeling), and relative permeability curves using a pore-network-model simulator. A new pore-network-generation method applicable to highly concave pore spaces, such as those of carbonates, will also be introduced.
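As an illustration of the first of these measurements, here is a minimal sketch of total and connected porosity computed from a segmented 3D image, assuming pore voxels are marked True and flow is along the z axis; the random volume is a placeholder:

```python
import numpy as np
from scipy import ndimage

def porosity_metrics(segmented):
    """Total and connected porosity of a segmented 3D image, where True
    marks pore voxels. Connected porosity keeps only the pore cluster(s)
    present on both end faces along the flow (z) axis."""
    total = segmented.mean()
    labels, _ = ndimage.label(segmented)
    spanning = (set(np.unique(labels[0])) & set(np.unique(labels[-1]))) - {0}
    connected = np.isin(labels, list(spanning)).mean()
    return total, connected

pores = np.random.random((64, 64, 64)) < 0.2  # placeholder segmented volume
print(porosity_metrics(pores))
```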
A scalable method using automation will be presented, so that repeating the simulations on different samples of different origins and sizes is easy.
We will present the results and limits of every method and determine the sample size at which the results converge. We will especially look at the convergence between direct simulations and pore-network-model-based ones, so that expanding the sample size prior to pore network generation can be done reliably.
In addition to the benchmark of the different simulation methods and their associated limits, the results will help us determine the representative elementary volume at different resolutions and the associated uncertainty, depending on whether sub-resolution acquisitions are available or not.
Figure: micro-CT and SEM images of the carbonate rock sample.
Standard approaches to optimization under uncertainty in reservoir simulation require use of multiple realizations, with variable parameters representing operational constraints and actions as well as uncertain scenarios. We will show how appropriate use of local optimization within the simulation model, using customized logic for field management strategies, can bring improved workflow flexibility and efficiency, by reducing the effort needed for uncertainty iterations.
To achieve meaningful forecasts for an ensemble of uncertain scenarios, it is important to distinguish between different types of decision. Investment decisions, such as facilities sizing, depend on global unknowns and must be optimized for the complete ensemble. Operational actions, such as closing a valve, can be optimized instantaneously for individual scenarios, using measurable information, although subject to constraints determined at a global level. In this study, we implement local optimization procedures within simulation cases, combining customized objective criteria to rank reactive or proactive actions, with the ability to query reservoir flow entities at appropriate frequencies.
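A minimal sketch of such in-simulation reactive logic, with hypothetical well attributes, trigger threshold, and ranking criterion; a real implementation would query the simulator's flow entities rather than plain dictionaries:

```python
# Reactive local optimization inside a simulation run (names and thresholds
# hypothetical): at a given query frequency, rank candidate actions by an
# objective criterion and apply the best ones, subject to a globally
# determined constraint.

WATERCUT_LIMIT = 0.9       # local trigger, measurable in each scenario
MAX_SHUT_PER_STEP = 2      # global constraint fixed for the whole ensemble

def reactive_step(wells):
    """wells: list of dicts with 'name', 'watercut', 'oil_rate', 'open'."""
    candidates = [w for w in wells if w["open"] and w["watercut"] > WATERCUT_LIMIT]
    # objective criterion: shut the offenders losing the least oil first
    candidates.sort(key=lambda w: w["oil_rate"])
    for w in candidates[:MAX_SHUT_PER_STEP]:
        w["open"] = False   # the action the in-model optimizer would take
    return [w["name"] for w in candidates[:MAX_SHUT_PER_STEP]]

wells = [
    {"name": "P1", "watercut": 0.95, "oil_rate": 120.0, "open": True},
    {"name": "P2", "watercut": 0.92, "oil_rate": 40.0, "open": True},
    {"name": "P3", "watercut": 0.50, "oil_rate": 300.0, "open": True},
]
print(reactive_step(wells))  # shuts P2 first (least oil lost), then P1
```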
The methods presented in the paper can be used for reactive response modeling for smart downhole control; optimization of ESP/PCP pump performance; and implementation of production plans subject to defined downstream limits. For selected cases, we compare the advantages and disadvantages of the local optimization approach with standardized "big-loop" uncertainty workflows. The methodology can significantly reduce optimization costs, particularly for high-frequency actions, achieving similar objective function values in a fraction of the time needed for post-processing optimizers. Use of tailored scripting provides the capability to modernize the logic framework for field management decisions, with realistic representation of smart field equipment and flow entities at any level of complexity.
Use of efficient workflows as described in this paper can reduce the cost of multiple realization studies significantly, or enable engineers to consider a wider range of possible scenarios, for deeper understanding and better risk mitigation.
Masini, Cristian (Petroleum Development Oman) | Al Shuaili, Khalid Said (Petroleum Development Oman) | Kuzmichev, Dmitry (Leap Energy) | Mironenko, Yulia (Leap Energy) | Majidaie, Saeed (Formerly with Leap Energy) | Buoy, Rina (Formerly with Leap Energy) | Alessio, Laurent Didier (Leap Energy) | Malakhov, Denis (Target Oilfield Services) | Ryzhov, Sergey (Formerly with Target Oilfield Services) | Postuma, Willem (Target Oilfield Services)
Unlocking the potential of existing assets and optimising production efficiently can be a challenging task, from both a resource and a technical-execution point of view, when traditional static and dynamic modelling workflows are used, making the decision-making process inefficient and less robust.
A set of modern techniques in data processing and artificial intelligence could change the pattern of decision-making for oil and gas fields within the next few years. This paper presents an innovative workflow based on predictive analytics and machine learning to establish a new approach to asset management and field optimisation. By merging classical reservoir engineering and Locate-the-Remaining-Oil (LTRO) techniques with smart data science and innovative deep-learning algorithms, this workflow shows that the turnaround time for subsurface asset evaluation and optimisation can shrink from many months to a few weeks.
In this paper we present the results of the study, conducted on the Z field located in the South of Oman, using an efficient ROCM (Remaining Oil Compliant Mapping) workflow within an advanced LTRO software package. The goal of the study was to perform an evaluation of quantified and risked remaining oil for infill drilling and establish a field redevelopment strategy.
The resource-in-place assessment is complemented with a production forecast. A neural network engine coupled with ROCM allowed various infill scenarios to be tested using predictive analytics. The results of the study were validated against 3D reservoir simulation, whereby a dynamic sector model was created and history matched.
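As a sketch of how such a forecasting engine might rank infill candidates, assuming per-location features and observed production from existing wells (synthetic placeholders here); this is an illustration, not the actual ROCM-coupled engine:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Illustrative stand-in for the forecasting step: a network trained on
# existing wells' local properties (e.g., remaining-oil thickness,
# permeability, distance to nearest producer) and their observed
# production, then applied to candidate infill locations.
rng = np.random.default_rng(2)
X_existing = rng.uniform(size=(150, 3))
q_existing = 50.0 * X_existing[:, 0] + 20.0 * X_existing[:, 1] - 10.0 * X_existing[:, 2]

engine = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
engine.fit(X_existing, q_existing)

X_infill = rng.uniform(size=(20, 3))                  # candidate infill locations
ranked = np.argsort(engine.predict(X_infill))[::-1]   # best candidates first
```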
The Z asset has a number of challenges, starting from the fact that for the last 25 years the field has been developed with horizontal producers. The geological challenges relate to the high degree of reservoir heterogeneity which, combined with high oil viscosity, leads to water-fingering effects. These aspects make dynamic modelling challenging and time-consuming.
In this paper, we describe in detail the workflow elements used to determine the risked remaining-oil saturation distribution, along with the results of ROCM and a full-field forecast for infill development scenarios using neural network predictive analytics, validated against the performance of drilled infills.