A reservoir simulation study comparing the performance of different fishbone well designs against a base development design of extra-long maximum reservoir contact (MRC) single-lateral wells is presented. The objective is to compare different well design concepts in a waterflood recovery scheme to achieve the target production rate and maximize resource value for economic development of an undeveloped tight carbonate reservoir. The studied reservoir is located in a giant offshore oil field in the Middle East and was used as a representation of the different tight reservoirs within the field. It is characterized by poor-quality rock, with permeability trending from 2 md to 0.5 md in a SE–NW direction.
The study comprises an assessment of the achievable initial maximum oil rate, volumetric reservoir sweep, and expected ultimate oil recovery factor for different well design concepts at a base short well spacing utilized for effective pressure support. In addition, the potential of fishbone well designs to reduce the well count was evaluated by comparing them at twice the base short well spacing against a single-lateral development design at the base short well spacing.
A sector model with an equal producer-to-injector ratio was used, with refined gridding around the wells and the bulk area gridded with 10 m by 10 m cells, in a representative area of the reservoir. The modeled wells were assigned vertical flow performance tables with gas lift capabilities. The analysis also incorporated streamline generation for analyzing the areal reservoir sweep of the fishbone well designs and an examination of the areal distribution of remaining movable oil. An assessment matrix was formulated to compare the extra-long MRC single-lateral base development design against the different fishbone well designs. In addition to reservoir-related flow performance indicators, the matrix incorporated drilling complexity and well cost, well life-cycle activities, and other factors for a comprehensive assessment.
The main findings show that fishbone well designs have complicated areal sweep performance, especially with a sealed motherbore, resulting in a lower oil recovery factor with higher hydrocarbon pore volume injected and a higher water-oil ratio compared to extra-long MRC single laterals. Fishbone well designs also have serious limitations during well life-cycle activities compared to the extra-long MRC single-lateral design in terms of stimulation, well accessibility, and well intervention options, making extra-long MRC single laterals the preferred field development concept within tight reservoirs, especially at the base short well spacing. Finally, the analysis has shown that fishbone well designs cannot reduce the well count, since the base short well spacing is still needed for effective pressure support by water injection, for maximizing the oil recovery factor within the field lifetime, and for building and sustaining the target plateau.
In this work we discuss the successful application of our previously developed automated scenario reduction approach to life-cycle optimization of a real field case. The inherent uncertainty in the description of reservoir properties motivates the use of an ensemble of model scenarios to achieve an optimized robust reservoir development strategy. To accurately span the range of uncertainties, it is imperative to build a relatively large ensemble of model scenarios. The size of the ensemble is directly proportional to the computational effort required in robust optimization. For high-dimensional, complex field case models, this implies that a large ensemble of model scenarios, while accurately capturing the inherent uncertainties, would be computationally infeasible to utilize for robust optimization. One way to circumvent this problem is to work with a reduced subset of model scenarios. Methods based on heuristics and ad hoc rules exist to select this reduced subset, but in most cases the optimal number of model realizations must be known upfront. An excessively small number of realizations may yield a subset that does not capture the span of uncertainties present, leading to sub-optimal optimization results. This raises the question of how to select a subset with an optimal number of realizations that both captures the uncertainties present and allows for a computationally efficient robust optimization. To answer this question we have developed an automated framework for selecting the reduced ensemble, which has been applied to an original ensemble of 300 equiprobable model scenarios of a real field case. The methodology relies on the fact that, ideally, the distance between the cumulative distribution functions (CDF) of the objective function (OF) of the full and reduced ensembles should be minimal.
This allows the method to determine the smallest subset of realizations that both spans the range of uncertainties and provides an OF CDF that is representative of the full ensemble, based on a statistical metric. In this real field case application we optimize the injection rates throughout the asset's life-cycle with expected cumulative oil production as the OF. The newly developed framework selected a small subset of 17 model scenarios out of the original ensemble, which was used for robust optimization. The optimal injection strategy achieved an average increase of 6% in cumulative oil production with a significant reduction, approximately 90%, in the computational effort. Validation of this optimal strategy over the original ensemble led to very similar improvements in cumulative oil production, highlighting the reliability and accuracy of our framework.
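As an illustration of the CDF-matching idea, the following sketch greedily grows a subset of realizations until its objective-function CDF is close to the full ensemble's. Both the Kolmogorov–Smirnov statistic as the distance metric and the greedy selection strategy are our assumptions for illustration; the paper's actual metric and procedure may differ.

```python
import numpy as np

def ks_distance(full, subset):
    """Max vertical gap between the empirical CDFs of two samples."""
    grid = np.sort(np.concatenate([full, subset]))
    cdf_full = np.searchsorted(np.sort(full), grid, side="right") / len(full)
    cdf_sub = np.searchsorted(np.sort(subset), grid, side="right") / len(subset)
    return np.abs(cdf_full - cdf_sub).max()

def select_subset(of_values, tol=0.05):
    """Greedily grow a subset until its OF CDF is within `tol` of the full ensemble's."""
    of_values = np.asarray(of_values, dtype=float)
    remaining = list(range(len(of_values)))
    chosen = []
    while remaining:
        # pick the candidate that most reduces the CDF distance
        best = min(remaining,
                   key=lambda i: ks_distance(of_values, of_values[chosen + [i]]))
        chosen.append(best)
        remaining.remove(best)
        if ks_distance(of_values, of_values[chosen]) <= tol:
            break
    return chosen
```

On a synthetic 300-member ensemble of OF values this typically selects a subset of a few tens of realizations, in the same spirit as the 17-member subset reported above.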
There are a vast number of reservoirs whose drill cuttings and core images have classification problems associated with them. The images may never have been classified in the first place, or the images may be available while the interpretation reports are missing. Another problem is that images from different wells may have been interpreted by different wellsite geologists/sedimentologists, resulting in an inconsistent classification scheme. Finally, some images may simply have been classified incorrectly. It would therefore be desirable to have an unbiased, objective system that could overcome all of these issues. Enter convolutional neural networks. Advances during this decade in using convolutional neural networks for visual recognition of distinctly different objects mean that object recognition can now be achieved to a significant extent. Once the network is trained on a representative set of lithological classes, such a system just needs to be fed raw drill cuttings or core images it has not seen before, and it will automatically assign a lithological class to each image along with the probability of the image belonging to that class. In so doing, images below a certain probability threshold can be automatically flagged for further human investigation. The benefit of such a system would be to improve reservoir understanding by having all available images classified in a consistent manner, keeping the characterization consistent as well. It would further reduce the time taken to bring human expertise to the task, as well as the associated cost.
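The probability-threshold flagging step described above can be sketched as follows; the class names, probability values, and threshold are all hypothetical placeholders, and the array stands in for the softmax output of a trained network.

```python
import numpy as np

# Hypothetical softmax outputs from a trained CNN for five cuttings images,
# over three illustrative lithology classes (names are placeholders).
classes = ["sandstone", "limestone", "dolomite"]
probs = np.array([
    [0.92, 0.05, 0.03],
    [0.40, 0.35, 0.25],   # ambiguous image
    [0.10, 0.85, 0.05],
    [0.33, 0.33, 0.34],   # ambiguous image
    [0.02, 0.08, 0.90],
])

predicted = [classes[i] for i in probs.argmax(axis=1)]   # class per image
confidence = probs.max(axis=1)                           # probability per image

# Flag low-confidence predictions for human review (threshold is an assumption).
THRESHOLD = 0.70
flagged = np.flatnonzero(confidence < THRESHOLD)
```

Here images 1 and 3 fall below the threshold and would be routed to a geologist, while the rest keep their automatic labels.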
Al-Farisi, Omar (Khalifa University of Science and Technology) | Zhang, Hongtao (Khalifa University of Science and Technology) | Raza, Aikifa (Khalifa University of Science and Technology) | Ozzane, Djamel (ADNOC) | Sassi, Mohamed (Khalifa University of Science and Technology) | Zhang, TieJun (Khalifa University of Science and Technology)
Automated image processing algorithms can improve the quality and speed of classifying the morphology of heterogeneous carbonate rock. Several commercial products derive petrophysical properties from 2D images and, to a lesser extent, from 3D images, relying on image processing and flow simulation. The images are mainly micro-computed tomography (μCT) scans, optical images of thin sections, or magnetic resonance images (MRI). However, most of the successful work has been on homogeneous and clastic rocks. In this work, we demonstrate a Machine Learning assisted Image Recognition (MLIR) approach to determine the porosity and lithology of heterogeneous carbonate rock by analyzing 3D images from μCT and MRI. Our research method consists of two parts: experimental and MLIR. Experimentally, we measured the porosity of a rock core plug in three different ways: (i) the weight difference between dry and saturated rock, (ii) NMR T2 relaxation of the saturated rock, and (iii) helium gas injection into the rock after cleaning and drying.
We performed MLIR on 3D μCT and MRI images using a random forest machine-learning algorithm. A petrophysicist provided a set of training data with classes (i.e., limestone, pyrite, and pore) as expert knowledge of the correspondence between μCT image intensity and petrophysical properties. MLIR performed each task of identifying the different lithology types and porosity on its own. The determined volumes were checked and confirmed against three different experimental datasets. The porosities measured by the three experiment-based approaches are very close to one another. Similarly, the MLIR-measured porosity agreed excellently with the three experimental measurements, with an accuracy of 97.1% on the training set and 94.4% on blind test prediction.
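A minimal sketch of the voxel-classification idea, using scikit-learn's random forest on synthetic one-dimensional intensity features: expert-labeled intensities train the classifier, and porosity is then the fraction of voxels predicted as pore. The class means, noise levels, and sample sizes are invented for illustration and do not come from the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Synthetic stand-in for expert-labeled μCT intensities (arbitrary units):
# pore voxels darkest, limestone intermediate, pyrite brightest (assumption).
def sample(n, mean, std):
    return rng.normal(mean, std, (n, 1))

X_train = np.vstack([sample(500, 50, 10), sample(500, 120, 15), sample(500, 220, 10)])
y_train = np.array([0] * 500 + [1] * 500 + [2] * 500)   # 0=pore, 1=limestone, 2=pyrite

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# Classify a synthetic "image" of 10,000 voxels with a known 20% pore fraction.
X_img = np.vstack([sample(2000, 50, 10), sample(7000, 120, 15), sample(1000, 220, 10)])
porosity = (clf.predict(X_img) == 0).mean()
```

In a real workflow the feature vector per voxel would be richer (neighborhood statistics, multi-modality intensities), but the porosity-as-class-fraction step is the same.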
Waterflooding is the main technique used to recover hydrocarbons from reservoirs. For a given set of wells (injectors and producers), the choice of injection/production parameters such as pressures, flow rates, and the locations of these boundary conditions has a significant impact on the operating life of the wells. As a large number of combinations of these parameters are possible, one of the critical decisions to make is identifying an optimal set of these parameters. Because using the reservoir simulator directly to evaluate the impact of these sets is unrealistic given the required number of simulations, a common approach consists of using response surfaces to approximate the reservoir simulator outputs. Several techniques involving proxy models (e.g., kriging, polynomials, and artificial neural networks) have been suggested to replace the reservoir simulations. This paper focuses on the application of artificial neural networks (ANN), as it is commonly accepted that ANNs are the most efficient due to their universal approximation capacity, i.e., their capacity to reproduce any continuous function. This paper presents a complete workflow to optimize well parameters under waterflooding using an artificial neural network as a proxy model. The proposed methodology allows different production configurations that maximize the NPV for a given risk to be evaluated. The optimized solutions can be analyzed with the efficient frontier plot and Sharpe ratios. An application of the workflow to the Brugge field is presented in order to optimize the waterflooding strategy.
Al-Jenaibi, Faisal (ADNOC - Upstream) | Shelepov, Konstantin (Rock Flow Dynamics) | Kuzevanov, Maksim (Rock Flow Dynamics) | Gusarov, Evgenii (Rock Flow Dynamics) | Bogachev, Kirill (Rock Flow Dynamics)
The application of intelligent algorithms that use clever simplifications and methods to solve computationally complex problems is rapidly displacing traditional methods in the petroleum industry. The latest forward-thinking approaches in history matching and uncertainty quantification were applied to a dynamic model with an unknown permeability model. The original perm-poro profile was constructed from synthetic data so that the Assisted History Matching (AHM) approach could be compared with the exact solution. It is assumed that relative permeabilities, endpoints, and all parameters other than absolute permeability cannot be modified to match oil/water/gas rates, gas-oil ratio, water injection rate, watercut, and bottomhole pressure.
The standard approach to matching a model via permeability variation is to split the grid into several regions. However, this process is a complete guess, as it is unclear in advance how to select the regions, and the geological prerequisites for such splitting usually do not exist. Moreover, the values of permeability and porosity in different grid blocks are correlated; changing these values independently for each region distorts the correlations or makes the model unphysical.
The proposed alternative involves decomposing the permeability model into spectrum amplitudes using the Discrete Cosine Transformation (DCT), a form of Fourier transform. The sum of all DCT amplitude components reconstructs the original property distribution. Constructing an uncertainty matrix for the permeability model typically involves subjective judgment and several optimization runs. The proposed multi-objective Particle Swarm Optimization (PSO), however, helps to reduce randomness and find an optimal solution, not dominated in any objective, with fewer runs. Further tuning of the Flexi-PSO algorithm is performed on its constituent components, such as swarm size, inertia, nostalgia, sociality, damping factor, neighbor count, neighborliness, the proportion of explorers, egoism, community, and relative critical distance, to increase the speed of convergence. Additionally, a clustering technique based on Principal Component Analysis (PCA) is suggested as a means to reduce the dimensionality of the space of resulting solutions while ensuring the diversity of the selected cluster centers.
The presented set of methods helps to achieve a qualitative and quantitative match with respect to any property, reduce the number of uncertainty parameters, and set up a generic and efficient approach to assisted history matching.
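The DCT parameterization can be illustrated with SciPy: a smooth synthetic log-permeability field is transformed, truncated to its low-frequency amplitudes, and inverse-transformed, showing how a few spectral parameters can approximately reconstruct the field. The field, grid size, and truncation block are assumptions for illustration.

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(7)

# Hypothetical smooth log-permeability field on a 32x32 grid.
x, y = np.meshgrid(np.linspace(0, 1, 32), np.linspace(0, 1, 32))
logk = 2.0 + np.sin(2 * np.pi * x) * np.cos(np.pi * y) \
       + 0.05 * rng.normal(size=x.shape)

amps = dctn(logk, norm="ortho")          # full spectrum: 1024 amplitudes

# Keep only the 8x8 low-frequency block: 64 parameters instead of 1024.
trunc = np.zeros_like(amps)
trunc[:8, :8] = amps[:8, :8]
logk_approx = idctn(trunc, norm="ortho")

rel_err = np.linalg.norm(logk - logk_approx) / np.linalg.norm(logk)
```

The history-matching algorithm can then vary the 64 retained amplitudes instead of thousands of cell values, which is the dimensionality reduction the abstract describes.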
The objective of this work is the prediction of the water salinity evolution trend for the Mexico Area-1 development, which foresees the injection of a mixture of seawater and produced water from the six different reservoirs connected to the same FPSO.
Predicting the salinity trend evolution is crucial for forecasting possible biogenic hydrogen sulphide (H2S) formation and for foreseeing the related impacts on completion and facility material selection and on health, safety and environment (HSE) management.
Traditional numerical simulations through stand-alone models do not consider the effects of the reciprocal interaction among the fields on production profiles and cannot simulate the salinity evolution of the produced and injected water mixture, which varies over time. To overcome this limit, a new tool was developed. It consists of a Python script that, introduced into the Area-1 Integrated Asset Model, allowed us to generate forecasts of the water salinity along the project lifetime. These simulations were essential for the souring risk assessment, providing the following results:
water salinity trend evolution at each injector well;
water salinity trend evolution at each producer well;
injection water breakthrough timing at the producer wells.
Moreover, the tool gave the opportunity to assess the efficiency of the injection strategy and to quantify the impact of changing salinity on water viscosity and on field recovery.
In conclusion, the innovative methodology applied in the Area-1 IAM (Integrated Asset Model) makes it possible to predict the salinity of the injected water and to foresee the salinity evolution of the produced water, generating valuable information and providing a flexible tool that allows several project-related uncertainties to be investigated simultaneously and solutions and mitigations to be evaluated promptly.
Moreover, once the reservoirs are on production, the numerical models integrated with the developed script will reproduce the historical salinity data, making it possible to identify preferential flow paths established by the fluids, virtually acting as a reservoir tracer technology.
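At its core, the salinity of the injection blend is a flow-weighted average of the seawater and produced-water streams; a minimal sketch with hypothetical rates and salinities (the time-varying coupling to the reservoir models is what the actual IAM script adds on top of this):

```python
# Mass-balance sketch of injected-water salinity when seawater and produced
# water are blended at the FPSO. All rates and salinities are hypothetical.
def injected_salinity(q_sw, s_sw, q_pw, s_pw):
    """Flow-weighted salinity of the injection blend (same units as inputs)."""
    return (q_sw * s_sw + q_pw * s_pw) / (q_sw + q_pw)

# e.g. 30,000 bbl/d seawater at 35 g/L blended with
# 10,000 bbl/d produced water at 150 g/L
s_inj = injected_salinity(30_000, 35.0, 10_000, 150.0)
```

Re-evaluating this balance at each timestep, as produced-water rate and salinity evolve, yields the injector-well salinity trends listed above.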
With the advent of high-resolution methods for predicting hydraulic fracture geometry and subsequent production forecasting, characterization of the productive shale volume and evaluation of completion design economics through science-based forward modeling become possible. However, operationalizing a simulation-based workflow to optimize designs while keeping up with the field operation schedule remains the biggest challenge, owing to the slow model-to-design turnaround cycle. The objective of this project is to apply the ensemble learning-based model concept to this issue; for the purpose of completion design, we summarize the numerical-model-centric unconventional workflow as a process that ultimately models production from a well pad (of multiple horizontal laterals) as a function of completion design parameters. Once development, validation, and analysis of the surrogate model are complete, the model can be used in predictive mode to respond to the "what if" questions raised by the reservoir/completion management team.
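A minimal sketch of such a surrogate: a gradient-boosting model (one possible ensemble learner; the abstract does not specify the authors' choice) is trained on a toy "simulator" mapping two hypothetical completion design parameters to pad EUR, then queried in predictive mode for a "what if" design. All parameter names, ranges, and the response function are assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(3)

# Hypothetical training set: completion design parameters -> simulated pad EUR.
n = 400
stage_spacing = rng.uniform(10.0, 60.0, n)     # m
proppant = rng.uniform(500.0, 3000.0, n)       # kg/m
X = np.column_stack([stage_spacing, proppant])

# Toy stand-in for the high-resolution fracture/production simulator.
def simulate_eur(spacing, prop):
    return 100.0 + 0.02 * prop - 0.5 * spacing \
           + rng.normal(0.0, 2.0, np.shape(spacing))

y = simulate_eur(stage_spacing, proppant)

surrogate = GradientBoostingRegressor(n_estimators=300, max_depth=3,
                                      random_state=0).fit(X, y)

# "What if" query: tighter spacing, higher proppant loading.
eur_hat = surrogate.predict([[20.0, 2500.0]])[0]
```

Because the surrogate answers in milliseconds, design screening can keep pace with the operational schedule, with the full simulator reserved for validating the short-listed designs.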
The traditional trial-and-error approach to history matching, in which engineers control each uncertain parameter to obtain an accurate model, can be quite time consuming and inefficient. In contrast, automatic history matching (AHM), assisted by computers, is an efficient process that controls a large number of parameters simultaneously through an algorithm integrating a static model with dynamic data to minimize a misfit and improve reliability. It helps to reduce simulation run time as well.
Particle Swarm Optimization (PSO) is a population-based stochastic algorithm that can explore the parameter space in combination with a least-squares single objective function. The AHM process can adopt parameterization and realization methods to simplify the inverse problem. In this study, realizations of various reservoir properties such as porosity, net-to-gross, relative permeability, horizontal and vertical permeability, and aquifer size were chosen as controls throughout the AHM. History matching was conducted to validate the efficiency of each method. Guidelines for optimized AHM with a stochastic algorithm are also discussed.
The realization and parameterization methods improved matching results in a full-field application, reducing the misfit in less time. A stochastic algorithm generates multiple models to deduce control parameters that reduce the misfit. In this study we found that PSO converged effectively with updated control parameters. The optimized AHM improved the accuracy of the full-field model, although some misfit remained in the match to bottomhole pressure.
We found that updating too many parameters makes the problem difficult to solve, while using too few leads to false convergence. In addition, since simulation run time is critical, a full-field simulation model with reduced computational overhead is beneficial.
In this study, we observed that PSO was an efficient algorithm for updating control parameters to reduce the misfit. Using parameterization and realization as assisting methods helped find better results. Overall, this study can be used as a guideline for optimizing the history matching process.
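For illustration, a minimal global-best PSO minimizing a toy least-squares misfit over three hypothetical control parameters. The parameter names, bounds, target values, and coefficient settings (inertia `w`, cognitive `c1`, social `c2`) are assumptions; a real AHM run would replace `misfit` with a reservoir-simulation-based objective.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy misfit: squared error against a known optimum at
# (kx, kz, aquifer) = (0.5, 0.1, 2.0) in hypothetical units.
target = np.array([0.5, 0.1, 2.0])
def misfit(p):
    return float(np.sum((p - target) ** 2))

def pso(misfit, lo, hi, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal global-best PSO with inertia w and cognitive/social weights c1, c2."""
    dim = len(lo)
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([misfit(p) for p in pos])
    g = pbest[np.argmin(pbest_val)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo, hi)          # keep particles in bounds
        vals = np.array([misfit(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, misfit(g)

best, best_val = pso(misfit,
                     lo=np.array([0.0, 0.0, 0.0]),
                     hi=np.array([1.0, 1.0, 5.0]))
```

Each misfit evaluation here is trivially cheap; in practice it is a full simulation run, which is why the parameterization and realization methods that shrink the search space matter so much.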
While image processing is still an area of research, standard workflows have emerged and are routinely used in oil and gas companies.
However, while hardware capabilities have increased considerably, allowing large samples to be scanned with high fidelity, permeability simulations are still limited to small samples unless one has access to HPC. Direct simulations are known to be more flexible in terms of the types of rocks handled but limited in terms of sample size, while pore-network-model-based simulations allow much larger sample sizes but fewer rock types.
In this study, we will focus on the pore space analysis of a middle-eastern carbonate sample. The rock sample is 7.5 cm tall and has a diameter of 3.8 cm.
It has been acquired at 3 different resolutions: a microCT scan at 16 μm, a microCT scan of a 10 mm diameter subsample at 5 μm, and an SEM section of 10 mm diameter at 2 μm.
This study will propose a methodology to combine the different scales in order to obtain an accurate pore space analysis of the largest possible sample size.
As micro porous regions are visible at every scale, bringing uncertainty to the segmentation step, the first part of our analysis will consist of determining the most accurate pore space at the three different resolutions. We will rely on image registration (2D to 3D and 3D to 3D) and image based upscaling methods, further validated by simulation results.
Given the large numerical size of the samples, specific workflows involving large data 3D visualization and processing will be presented.
Then, different measurements will be conducted: porosity and connected porosity, absolute permeability with three different methods (Lattice Boltzmann, Finite Volume, Pore Network Modeling), and relative permeability curves using a Pore Network Model simulator. A new pore network model generation method applicable to highly concave pore spaces, such as those of carbonates, will also be introduced.
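The first two measurements, porosity and connected porosity, can be sketched on a synthetic segmented image with SciPy; the 64³ grid, 40% pore fraction, face connectivity, and the z = 0 inlet-face criterion are illustrative assumptions, not details from the study.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(9)

# Hypothetical segmented 64^3 image: True = pore voxel, False = grain.
pore = rng.random((64, 64, 64)) < 0.4

# Total porosity: fraction of pore voxels.
porosity = pore.mean()

# Connected porosity: keep only pore voxels linked to the inlet face (z = 0)
# through face-connected neighbors.
labels, _ = ndimage.label(pore)                  # 6-connectivity by default
inlet = labels[0]                                # labels on the inlet face
inlet_labels = np.unique(inlet[inlet > 0])
connected = np.isin(labels, inlet_labels)
connected_porosity = connected.mean()
```

On real segmented μCT volumes the same labeling step also feeds the permeability solvers, since only the connected pore space contributes to flow.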
A scalable method using automation will be presented, so that repeating the simulations on different samples of different origins and sizes is easy.
We will expose the results and limits of every method and determine which sample size brings convergence of the results. We will especially look at the convergence between direct simulations and pore-network-model-based ones, so that expanding the size prior to Pore Network Model generation can be done reliably.
In addition to the benchmark of the different simulation methods and their associated limits, the results will help us determine the representative elementary volume at different resolutions and the associated uncertainty, depending on whether sub-resolution acquisitions are available or not.
MicroCT and SEM images of the carbonate rock sample