Ensemble-based methods (especially various forms of iterative ensemble smoothers) have proven effective in calibrating multiple reservoir models so that they are consistent with historical production data. However, due to the complex nature of hydrocarbon reservoirs, model calibration is never perfect: the model is always a simplified version of reality, with coarse representation and unmodeled physical processes. The flaw in the model that causes a mismatch between actual observations and simulated data when ‘perfect’ model parameters are used as model input is known as ‘model error’. Assimilating data without accounting for this model error can result in incorrect adjustments to model parameters, underestimation of prediction uncertainties, and bias in forecasts.
In this paper, we investigate the benefit of recognising and accounting for model error when an iterative ensemble smoother is used to assimilate production data. The correlated ‘total error’ (the combination of model error and observation error) is estimated from the data residual after a standard history matching using the Levenberg-Marquardt form of the iterative ensemble smoother (LM-EnRML). This total error is then used in further data assimilations to improve the model prediction and uncertainty quantification from the final updated model ensemble. We first illustrate the method on a synthetic 2D five-spot case, where some model errors are deliberately introduced and the results are closely examined against the known ‘true’ model. The Norne field case is then used to further evaluate the method.
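The two-step idea above, estimating a correlated total-error covariance from residuals and reusing it in the smoother update, can be sketched with a toy linear forward model. Everything here (the dimensions, the linear operator `G`, the synthetic residuals) is an illustrative assumption, not the paper's implementation; the residual covariance merely stands in for the total error estimated after a first LM-EnRML pass.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (assumptions): Ne ensemble members, Nm model parameters, Nd data points
Ne, Nm, Nd = 50, 10, 5

# A linear operator G stands in for the reservoir simulator, g(m) = G m
G = rng.normal(size=(Nd, Nm))
M = rng.normal(size=(Nm, Ne))            # prior parameter ensemble
D_pred = G @ M                           # predicted data ensemble
d_obs = rng.normal(size=Nd)              # observed data

# Step 1: estimate a correlated 'total error' covariance from data residuals.
# The residuals here are synthetic; in the paper they come from the mismatch
# remaining after a standard LM-EnRML history match.
residuals = d_obs[:, None] - D_pred
C_total = np.cov(residuals)              # combined model-error + observation-error covariance

# Step 2: an ensemble-smoother update that uses C_total in place of the usual
# diagonal observation-error covariance.
dM = M - M.mean(axis=1, keepdims=True)
dD = D_pred - D_pred.mean(axis=1, keepdims=True)
C_md = dM @ dD.T / (Ne - 1)              # parameter/data cross-covariance
C_dd = dD @ dD.T / (Ne - 1)              # data covariance
K = C_md @ np.linalg.inv(C_dd + C_total)

# Perturbed-observation update of every ensemble member
D_obs = d_obs[:, None] + rng.multivariate_normal(np.zeros(Nd), C_total, Ne).T
M_post = M + K @ (D_obs - D_pred)
```

Using `C_total` instead of a diagonal observation-error covariance down-weights directions in data space where the residuals are large and correlated, which is the mechanism by which accounting for total error tempers over-fitting of the parameters.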
The Norne model has previously been history matched using the LM-EnRML (
In this paper, we use a combination of acoustic impedance and production data for history matching the full Norne Field. The purpose of the paper is to illustrate a robust and flexible workflow for assisted history matching of large data sets. We apply an iterative ensemble-based smoother, and the traditional approach for assisted history matching is extended to include updates of additional parameters representing rock clay content, which has a significant effect on seismic data. Further, for seismic data it is a challenge to properly specify the measurement noise, because the noise level and the spatial correlation of the measurement noise are unknown. For this purpose, we apply a method based on image denoising for estimating the spatially correlated (colored) noise level in the data. For the best possible evaluation of the workflow performance, all data are synthetically generated in this study. We assimilate production data and seismic data sequentially. First, the production data are assimilated using traditional distance-based localization, and the resulting ensemble of reservoir models is then used when assimilating seismic data. This procedure is suitable for real field applications, because production data are usually available before seismic data. If both production data and seismic data were assimilated simultaneously, the high number of seismic data might dominate the overall history-matching performance.
The noise estimation for seismic data involves transforming the observations to a discrete wavelet domain. However, the resulting data do not have a clear spatial position, and the traditional distance-based localization schemes, used to avoid spurious correlations and underestimated uncertainty (because of limited ensemble size), cannot be applied. Instead, we use a localization scheme based on correlations between observations and parameters, which does not rely on the physical position of model variables or data. This method automatically adapts to each observation and each iteration.
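The wavelet-domain noise-estimation idea can be illustrated with a one-level 2-D Haar transform and the standard median-based estimator: for a smooth signal, the finest diagonal detail band is dominated by the noise. The synthetic map, the noise level of 0.05, and the single-level transform are illustrative assumptions here, not the paper's actual denoising procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# A smooth synthetic map stands in for a seismic attribute; sigma = 0.05 is the
# noise level we try to recover.
x = np.linspace(0.0, 3.0, 64)
clean = np.outer(np.sin(x), np.cos(x))
noisy = clean + 0.05 * rng.normal(size=clean.shape)

# One level of a 2-D Haar transform: horizontal differences, then vertical
# differences of those, give the diagonal detail coefficients.
h = (noisy[:, ::2] - noisy[:, 1::2]) / np.sqrt(2.0)
d = (h[::2, :] - h[1::2, :]) / np.sqrt(2.0)

# Robust noise estimate (the Donoho-Johnstone rule): sigma ~ MAD / 0.6745
sigma_hat = np.median(np.abs(d)) / 0.6745
```

Because the smooth signal contributes almost nothing to the diagonal detail band, `sigma_hat` recovers the injected noise level; spatially correlated noise would additionally require inspecting coarser wavelet scales.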
The results show that we reduce data mismatch for both production and seismic data, and that the use of seismic data reduces estimation errors for porosity, permeability, and net-to-gross ratio (NTG). Such improvements can provide useful information for reservoir management and planning for additional drainage strategies.
The field development phase prior to investment sanction is characterized by relatively large uncertainties at the time important decisions have to be made. It is, for instance, crucial to select an appropriate recovery strategy (depletion or injection) to maximize cumulative hydrocarbon production whilst ensuring good profitability of the project. Evaluation of reservoir and economic uncertainties and quantification of their impact are needed before the field development concept is selected.
This paper describes how to stochastically assess reservoir and economic uncertainties and the screening process used to select the best recovery strategy. The chosen methodology combines uncertainty studies that include continuous, discrete, and controllable parameters. The different screened scenarios are combined in a stochastic decision tree, built up from decision and chance nodes, to establish a distribution of recoverable volumes and rank the recovery strategies against a chosen criterion. A second uncertainty study is performed by adding economic uncertainties to the initial set of uncertain reservoir parameters. Eventually a new decision tree is established and the scenarios are ranked using economic criteria.
The application of this methodology to an oil field on the Norwegian continental shelf and the resulting ranking of recovery strategies are presented in this paper. The described methodology has exposed the risks and uncertainties carried by the project, as it was possible to rank the different solutions based on the dispersion of the recoverable-volumes distribution and/or on the net present value (NPV). In the context of a marginal or large-capex project, a robust P90 case is required, and this may therefore influence the choice of recovery strategy. For instance, a scenario yielding the largest hydrocarbon volume may not be selected because it requires too many wells and/or too large an investment, if one of these criteria is defined as the most important. In addition, the combination of uncertainty studies enabled a full economic evaluation covering the entire recoverable-volumes distribution, whereas in many projects economic evaluation is focused on the P90, Mean, and P10 scenarios.
The two-step integrated approach allows a decision to be made whilst taking into account both reservoir and economic aspects. Having a combined stochastic approach to the reservoir and economic uncertainties avoids a biased decision. All cases are stochastically covered and screened using a systematic and unified methodology that gives the same weight to each scenario.
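The ranking mechanics of such a stochastic decision tree can be sketched in a few lines: each decision branch is a recovery strategy, and each chance node carries probability-weighted scenario outcomes. All the numbers below are hypothetical; the real workflow would populate the outcomes from the screened reservoir and economic scenarios.

```python
# Each strategy (decision branch) holds (probability, NPV) pairs at its chance
# node; the probabilities and NPVs here are purely illustrative.
strategies = {
    "depletion": [(0.3, 120.0), (0.5, 90.0), (0.2, 40.0)],
    "injection": [(0.3, 150.0), (0.5, 100.0), (0.2, 20.0)],
}

def expected_npv(outcomes):
    """Value of a chance node: the probability-weighted sum of its outcomes."""
    return sum(p * v for p, v in outcomes)

# Rank strategies on expected NPV; the chosen criterion could equally be a
# percentile of the recoverable-volumes distribution, e.g. a robust P90.
ranking = sorted(strategies, key=lambda s: expected_npv(strategies[s]), reverse=True)
```

Because every scenario enters the tree with its own probability, each case is given an explicit weight, which is exactly what avoids the bias of evaluating only a handful of hand-picked cases.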
Islam, M. S. (Dhofar University Salalah) | Kleppe, J. (Norwegian University of Science and Technology) | Rahman, M. M. (Bangladesh University of Engineering and Technology) | Abbasi, F. (Dhofar University Salalah)
Low salinity water-flooding (LSW) is a promising Improved Oil Recovery (IOR) process in which the salinity of the injected water is controlled to improve oil recovery over conventional seawater-flooding and other EOR methods. Published laboratory studies and field test cases from the last two decades have suggested several mechanisms of oil recovery by LSW, which is still an immature area of research. However, commercial reservoir simulators have limited capability to model LSW accurately. The principal objective of this paper is to evaluate the potential of IOR using LSW for the Norne Field's E-Segment.
About 60% of the oil is still trapped as residual oil, especially in the Ile and Tofte formations of the Norne Field's E-Segment, even after the completion of primary recovery and seawater-flooding as a secondary recovery technique. LSW simulation is, therefore, run for a period of 18 years, from 2005 to 2022, to extract this residual oil adhering to the rock walls. LSW simulation studies using the original wells indicated that water injection with an optimal salt concentration of 1,000 ppm TDS (total dissolved salts), or 1.0 kg/m3, yields substantially higher oil production (34.13 MSCM) than seawater-flooding (32.95 MSCM).
With this encouraging finding for LSW, the next stage of the study involves simulating different scenarios. Six different cases have been investigated. Case 1 is the base case with seawater-flooding using the existing wells. The other five cases are all LSW, with the following well configurations: (2) using the original wells, (3) using the original wells in addition to a new producing well, (4) using the original wells in addition to a re-completed producing well, (5) using the original wells in addition to a new injection well, and (6) using the original wells in addition to a re-completed injection well. Case 3 indicated the highest oil recovery (50.10 MSCM) compared to the other five cases. Thus, it could be concluded that the Norne Field's E-Segment is a good candidate for LSW to extract the residual or trapped oil.
In this study, LSW simulation results have shown that the alteration of wettability from neutral-wet to strongly water-wet, which in turn results in a favorable mobility ratio, is the most important IOR mechanism for the Norne Field's E-Segment. LSW in this case is found to give near piston-like displacement.
Islam, M. S. (Dhofar University) | Kleppe, J. (Norwegian University of Science and Technology) | Abbassi, F. (Dhofar University) | Haque, M. F. (Bangladesh Petroleum Exploration and Production Company Limited)
This study evaluates the economic potential of the Norne Field's E-Segment (a Norwegian offshore oil field) under different cost structures for six different field development strategies, based on the simulation results of low salinity water-flooding (LSW).
The ultimate oil recovery of the Norne Field's E-Segment is ca. 40% after employing a combination of primary recovery and seawater-flooding as a secondary recovery technique. Therefore, about 60% of the oil is still trapped as a result of the high capillary action of water. This trapped oil could be extracted by introducing a novel recovery mechanism; in this case, LSW is considered to extract the residual oil adhering to the rock walls. LSW simulation studies using the original wells indicated that water injection with an optimal salt concentration of 1,000 ppm TDS (total dissolved salts), or 1.0 kg/m3, yields substantially higher oil production than seawater-flooding. Having found that LSW is an effective Improved Oil Recovery (IOR) technique for the Norne Field's E-Segment, the next tasks are to find different ways to increase oil recovery using LSW, followed by an economic feasibility study. Six different well development plans are investigated. Case 1 is the base case with seawater-flooding using the existing wells. The other five cases are all LSW, with the following well configurations: (2) using the original wells, (3) using the original wells in addition to a new producing well, (4) using the original wells in addition to a re-completed producing well, (5) using the original wells in addition to a new injection well, and (6) using the original wells in addition to a re-completed injection well. An economic feasibility study has been conducted for all six cases using the Net Present Value (NPV), Profitability Index (PI), and Internal Rate of Return (IRR) methods.
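The three indicators used to compare the cases can be computed with short, standard routines. The cash flows and the 10% discount rate below are hypothetical placeholders, not figures from the study.

```python
# Minimal implementations of NPV, PI, and IRR for a yearly cash-flow list,
# where cashflows[0] is the (negative) initial investment.

def npv(rate, cashflows):
    """Net present value at the given discount rate."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def profitability_index(rate, cashflows):
    """PI: present value of future cash flows divided by the initial investment."""
    pv_future = sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows) if t > 0)
    return pv_future / -cashflows[0]

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-8):
    """IRR by bisection: the discount rate at which NPV crosses zero."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if npv(mid, cashflows) > 0.0:
            lo = mid          # NPV still positive: the root lies at a higher rate
        else:
            hi = mid
    return 0.5 * (lo + hi)

cashflows = [-100.0, 40.0, 45.0, 50.0]   # hypothetical: year-0 capex, then yearly net revenue
```

A case is economic when NPV > 0, PI > 1, and IRR exceeds the hurdle rate; ranking the six development plans on these three numbers is what the feasibility study amounts to.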
Analyzing the simulation results of the above six field development plans reveals that the oil production for the five LSW cases is higher than for the seawater-flooding case, which in turn results in higher NPV, PI, and IRR for the LSW cases. Among the six field development cases, Case 3 indicated the highest cumulative oil recovery compared to the other five cases and gives the highest NPV. Thus, it could be concluded that the Norne Field's E-Segment is a good candidate for LSW to extract the residual or trapped oil.
In this study, LSW simulation results are found to give a favorable mobility ratio, which in turn results in the highest oil recovery and the highest revenue.
Ensemble-based methods are among the state-of-the-art history matching algorithms. In practice, they often suffer from ensemble collapse, a phenomenon that deteriorates history matching performance. To prevent ensemble collapse, it is customary to equip an ensemble history matching algorithm with a localization scheme. Conventional localization methods use distances between the physical locations of model variables and observations to modify the degree of the observations' influence on model updates. Distance-based localization methods work well in many problems, but they also suffer from some long-standing issues, including the dependence on the presence of physical locations for both model variables and observations, the challenges in dealing with nonlocal and time-lapse observations, and the lack of adaptivity in handling different types of model variables. To enhance the applicability of localization to various history matching problems, we propose to adopt an adaptive localization scheme that exploits the correlations between model variables and observations. We elaborate on how correlation-based adaptive localization can mitigate or overcome the noted issues arising in conventional distance-based localization.
To demonstrate the efficacy of correlation-based adaptive localization, we apply it to history-match the real production data of the full Norne field model using an iterative ensemble smoother (iES), and compare the history matching results to those obtained using the same iES with distance-based localization. Our study indicates that, in comparison to distance-based localization, correlation-based localization not only achieves comparable or better performance in terms of data mismatch, but is also more convenient to implement and use in practical history matching problems. As a result, the proposed correlation-based localization scheme may serve as a viable alternative to conventional distance-based localization.
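The core of a correlation-based taper can be sketched as follows: compute the sample correlation between every (parameter, observation) pair and suppress entries indistinguishable from sampling noise. The ensemble sizes, the synthetic data, and the 1/sqrt(Ne) threshold are all assumptions for illustration; the paper's scheme adapts the criterion per observation and per iteration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy ensemble (sizes are assumptions): Ne members, Nm model variables, Nd observations
Ne, Nm, Nd = 100, 200, 30
M = rng.normal(size=(Nm, Ne))                # parameter ensemble
D = 0.1 * rng.normal(size=(Nd, Ne))          # predicted-data ensemble
D[:10] += 2.0 * M[:10]                       # give the first 10 obs genuine correlation

# Sample correlation between every (parameter, observation) pair
dM = M - M.mean(axis=1, keepdims=True)
dD = D - D.mean(axis=1, keepdims=True)
corr = (dM @ dD.T) / (
    np.linalg.norm(dM, axis=1)[:, None] * np.linalg.norm(dD, axis=1)[None, :]
)

# Correlation-based taper: keep only entries whose sample correlation exceeds
# the sampling-noise level; 1/sqrt(Ne) is a common rule of thumb used here as
# an assumption, not necessarily the paper's exact criterion.
threshold = 1.0 / np.sqrt(Ne)
taper = np.where(np.abs(corr) > threshold, 1.0, 0.0)

# The taper is applied element-wise to the cross-covariance (and hence the
# gain), exactly where a distance-based scheme would apply a distance taper.
C_md = dM @ dD.T / (Ne - 1)
C_md_loc = taper * C_md
```

Note that nothing in this construction refers to physical position, which is why it extends directly to nonlocal observations such as wavelet coefficients of seismic data.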
Optimizing the drainage strategy is an important part of petroleum reservoir management and has to be implemented as a dynamic process. This work describes a unique combination of standalone and coupled reservoir simulation modeling, as well as production decline curve analysis and tracer data interpretation, to establish an optimal gas injection strategy for the Smørbukk and Smørbukk Sør fields in the Norwegian Sea for improved reservoir management in terms of reserves and economy. The drainage strategy challenge of the Smørbukk and Smørbukk Sør fields is how to prioritize gas export from the fields and how to allocate the remaining gas to the efficient injectors for enhanced condensate recovery. The results triggered a change in injection strategy to rank gas disposition internally in optimized locations in both the Smørbukk and Smørbukk Sør fields, and to split the injection between the two fields. The results showed that the gas-oil-ratio development of producers, gas injection efficiency (GIF), well-to-well tracer communication, and produced water-gas ratio development are the key factors. Reservoir segments with lower produced gas-oil ratio and higher water-gas ratio development have proven the most favorable locations for gas injection. It is also demonstrated that injection efficiency decreases over time; accordingly, the drainage strategy consists of two phases: an optimized gas cycling period into efficient injectors in the Smørbukk and Smørbukk Sør fields, and a blowdown phase that includes converting injectors to producers when the injection is no longer efficient. The work addresses the workflow and methodology for the drainage strategy, the main challenges, and the related lessons associated with the reservoir simulation results, compared with production data and tracer results. The paper sheds light on the future of reservoir management and forecasting of drainage strategy for mature complex fields.
This work describes a methodology that evaluates the Discrete Latin Hypercube with Geostatistical Realizations (DLHG) sample size for complex models in the history matching under uncertainties process, with application to the Norne benchmark case. The sample size affects the time required and the accuracy of results in a history matching process, because a small sample size can yield inaccurate risk quantification and a large sample size can demand excessive time to reach good results. Both factors should be evaluated in order to improve the project's efficiency and to obtain reliable results. Such evaluation gains greater importance in complex reservoir models, because the number of tests needed to determine the reservoir scenarios that match dynamic data can be high due to the level of complexity. The methodology presented in this work is divided into three steps. First, we evaluate the ability of DLHG to produce output cumulative distribution functions (CDFs) that replicate those of a more exhaustive sampling technique (Monte Carlo) using the Kolmogorov-Smirnov test; the output is the misfit between observed and simulated production rates. Then, we compare the influence and correlation matrices obtained with the DLHG and Monte Carlo samples. The influence matrix shows the impact of the uncertainty variation on the outputs, and the correlation matrix measures the strength of the dependence between the uncertainty attributes and the outputs. Finally, we perform the stability test. The methodology was applied to the Norne benchmark case, a field located in the Norwegian Sea. The main characteristics of the methodology are: (1) it uses a statistical technique to compare the output CDFs from the reference and DLHG samples, and (2) it evaluates the ability of the DLHG sample to identify the reservoir attributes that affect the history match results. We evaluated DLHG sample sizes of 20, 50, 100, and 200, and considered an MC sample size of 5,000 for the Norne benchmark case.
The DLHG CDFs for the sample size of 100 were able to accurately replicate the corresponding MC CDFs; however, this sample size did not replicate the behavior of the influence and correlation matrices. The DLHG sample size of 200 was able to reproduce the output CDFs and the influence and correlation matrices, and it was considered stable. The study showed that even if a sample size is able to reproduce the output CDFs from a reference solution, the influence and correlation matrices should also be evaluated. The methodology presented can be incorporated into usual history matching routines.
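The first step of the methodology, comparing a small design's output CDF against an exhaustive Monte Carlo reference, rests on the two-sample Kolmogorov-Smirnov statistic, which can be computed directly. The misfit distributions below are synthetic stand-ins (an assumption), not the Norne outputs.

```python
import numpy as np

rng = np.random.default_rng(3)

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical distance
    between the two empirical CDFs, evaluated at every sample point."""
    a, b = np.sort(a), np.sort(b)
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / a.size
    cdf_b = np.searchsorted(b, grid, side="right") / b.size
    return float(np.max(np.abs(cdf_a - cdf_b)))

# Stand-ins for the misfit outputs: a large Monte Carlo reference sample and
# two smaller samples playing the role of DLHG designs of sizes 20 and 200.
mc_ref = rng.normal(loc=1.0, scale=0.3, size=5000)
dlhg_20 = rng.normal(loc=1.0, scale=0.3, size=20)
dlhg_200 = rng.normal(loc=1.0, scale=0.3, size=200)

d_20 = ks_statistic(dlhg_20, mc_ref)
d_200 = ks_statistic(dlhg_200, mc_ref)
# The larger design's empirical CDF typically tracks the reference more closely.
```

As the study found, a small KS distance alone is not sufficient: a design can pass this CDF test and still fail to reproduce the influence and correlation matrices.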
The Markov chain Monte Carlo (McMC) stochastic approach is widely used to estimate subsurface properties. However, quantitative uncertainty estimation is also very important when performing stochastic inversion. Therefore, the goal of this paper is to apply the transdimensional, or reversible jump, McMC (rjMcMC) method to obtain a 3-D seismic impedance model and to determine a corresponding uncertainty cube by estimating the standard deviation of the models included in the Markov chains. By combining the uncertainty volume and the impedance models, we can estimate the acoustic impedance and the uncertainty of the layer boundary locations. The uncertainty can also be related to the magnitude of velocity discontinuity. To demonstrate the performance and reliability of the rjMcMC inversion, we used seismic data from the E-segment of the Norne field in the Norwegian Sea. The results of the transdimensional McMC inversion show high velocity contrasts near gas-oil contacts and high uncertainty near discontinuities.
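Once the chains have been run, building the uncertainty cube reduces to cell-wise statistics over the retained models. The grid size, the sample count, and the Gaussian stand-in for the chain samples below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical stand-in for the impedance models retained in the Markov chains
# after burn-in, on a small (nx, ny, nz) grid.
n_samples, nx, ny, nz = 500, 4, 4, 10
chain_models = rng.normal(loc=6000.0, scale=200.0, size=(n_samples, nx, ny, nz))

# Point estimate and uncertainty cube: the cell-wise mean and standard
# deviation over the chain samples, as the abstract describes.
impedance_mean = chain_models.mean(axis=0)
uncertainty_cube = chain_models.std(axis=0)
```

In the real inversion the per-cell spread is far from uniform: cells near layer boundaries and velocity discontinuities exhibit the largest standard deviations.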
Presentation Date: Wednesday, September 27, 2017
Start Time: 8:55 AM
Presentation Type: ORAL
In this paper we propose a proxy-model-based seismic history matching (SHM) approach and apply it to time-lapse (4D) seismic data from a Norwegian Sea field. A stable proxy model is developed for generating 4D seismic attributes using only the original baseline seismic data and dynamic pressure and saturation predictions from reservoir flow simulation. This method (
In this study we first perform a check on the validity and accuracy of the proxy approach, following the methodology of (