Brazilian pre-salt carbonate fields contain a large number of faults detected from seismic and well data. Nevertheless, because of limitations in seismic resolution, not all existing faults can be identified. This is one of the main challenges for understanding related heterogeneities (vugs, karst) and flow behavior. This paper deals with fault analysis and modeling using an original approach and fault data from three pre-salt reservoirs.
One possible approach for characterizing and modeling the fault network (
The results presented in this article lead us to discuss how to choose the samples for modeling sub-seismic faults based on the ensemble of available seismic faults. The article addresses the question of which available seismic faults should be used for estimating the fractal dimension: all seismic faults near the reservoir area, or only the faults inside the reservoir contour? After this short discussion of the fractal dimension choice from a spatial-distribution point of view, the impact of this choice on flow is illustrated. The sub-seismic fault models were built using different fractal dimensions. Subsequently, an upscaling step using analytical upscaling (
Characterizing sub-seismic faults has a major impact on the overall flow behavior of the field. The chosen methodology had previously been applied only to synthetic cases and never published using real data, so this work will interest practicing engineers. The fault network of these neighboring reservoirs allows us to illustrate the importance of the choice of fractal dimension for characterizing the fault network, and its impact on the sub-seismic models, fluid displacement, and consequently production.
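The fractal-dimension estimation discussed above is typically done by box counting over the mapped fault traces: count the occupied boxes N(s) at several box sizes s and regress log N(s) against log(1/s). A minimal sketch (function name, scale choices and test data are illustrative, not from the paper):

```python
import numpy as np

def box_counting_dimension(points, n_scales=8):
    """Estimate the fractal (box-counting) dimension of a set of 2D
    fault-trace points by regressing log N(s) against log(1/s)."""
    points = np.asarray(points, dtype=float)
    mins = points.min(axis=0)
    span = (points - mins).max() + 1e-12
    sizes = span / 2 ** np.arange(1, n_scales + 1)
    counts = []
    for s in sizes:
        # count distinct occupied boxes of side s
        idx = np.floor((points - mins) / s).astype(int)
        counts.append(len({tuple(i) for i in idx}))
    slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
    return slope

# sanity check: points on a straight line should give dimension ~1
line = np.column_stack([np.linspace(0, 1, 2000), np.linspace(0, 1, 2000)])
print(round(box_counting_dimension(line), 1))
```

In a study like the one above, the `points` would be the digitized seismic fault traces, and the sensitivity of the fitted slope to which faults are included (near-field vs. inside the reservoir contour) is exactly the choice the article discusses.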
Nandi Formentin, Helena (Durham University and University of Campinas) | Vernon, Ian (Durham University) | Avansi, Guilherme Daniel (University of Campinas) | Caiado, Camila (Durham University) | Maschio, Célio (University of Campinas) | Goldstein, Michael (Durham University) | Schiozer, Denis José (University of Campinas)
Reservoir simulation models incorporate physical laws and reservoir characteristics. They represent our understanding of sub-surface structures based on the available information. Emulators are statistical representations of simulation models, offering fast evaluations of a sufficiently large number of reservoir scenarios to enable a full uncertainty analysis. Bayesian History Matching (BHM) aims to find the range of reservoir scenarios that are consistent with the historical data, in order to provide a comprehensive evaluation of reservoir performance and consistent, unbiased predictions incorporating realistic levels of uncertainty, as required for full asset management. We describe a systematic approach that combines reservoir simulation and emulation techniques within a coherent Bayesian framework for uncertainty quantification.
Our systematic procedure is an alternative, more rigorous tool for reservoir studies dealing with probabilistic uncertainty reduction. It comprises the design of sets of simulation scenarios to facilitate the construction of emulators capable of accurately mimicking the simulator with known levels of uncertainty. Emulators accelerate the steps that require large numbers of evaluations of the input space for the analysis to be valid from a statistical perspective. Via implausibility measures, we compare emulated outputs with historical data, incorporating major process uncertainties. We then iteratively identify regions of input-parameter space unlikely to provide acceptable matches, performing more runs and reconstructing more accurate emulators at each wave, an approach that benefits from several efficiency improvements. We provide a workflow covering each stage of this procedure.
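In its standard BHM form, the implausibility measure is the standardized distance between the emulator expectation and the historical value, with emulator, observation-error and model-discrepancy variances combined in the denominator; inputs exceeding a cutoff (commonly 3, after the three-sigma rule) are ruled out. A minimal sketch with illustrative numbers:

```python
import numpy as np

def implausibility(z, mu_em, var_em, var_obs, var_disc):
    """Implausibility used in Bayesian History Matching: standardized
    distance between the emulator mean and the historical value z,
    combining emulator variance, observation-error variance and
    model-discrepancy variance (argument names are illustrative)."""
    return np.abs(z - mu_em) / np.sqrt(var_em + var_obs + var_disc)

# candidate inputs with I(x) > 3 are discarded from the next wave
mu = np.array([10.0, 14.0, 25.0])   # emulator means at 3 candidate inputs
var = np.array([1.0, 4.0, 1.0])     # emulator variances at those inputs
I = implausibility(z=12.0, mu_em=mu, var_em=var, var_obs=1.0, var_disc=0.25)
keep = I < 3.0
print(keep)
```

The iterative waves described above amount to repeating this test with emulators refitted on the surviving (non-implausible) region.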
The procedure was applied to reduce uncertainty in a complex reservoir case study with 25 injection and production wells. The case study contains 26 uncertain attributes representing petrophysical, rock-fluid and fluid properties. We selected phases of evaluation considering specific events during reservoir management, improving the efficiency of simulation resource use. We identified and addressed data patterns untracked in previous studies: simulator targets,
We advance the applicability of Bayesian History Matching for reservoir studies with four deliverables: (a) a general workflow for systematic BHM; (b) the use of phases to progressively evaluate the historical data; (c) the integration of two-class emulators in the BHM formulation; and (d) the demonstration of internal discrepancy as a source of error in the reservoir model.
Seismic data usually have lower vertical resolution than reservoir simulation models, so it is common practice to generate maps of 4D attributes to be used as the observed data for calibrating models. In such cases, simulation results are converted to seismic attributes and a map is generated by averaging the corresponding layers. Although this seems a fair practice, here we show that the procedure has some drawbacks, and we propose a new approach to ensure a proper data comparison.
The first step of the proposed procedure follows the traditional sequence, in which seismic attributes are generated by running a petro-elastic model (PEM) with reservoir simulation data at the simulation scale. Then, instead of averaging the simulation layers, we propose to resample the simulation grid to a seismic grid and filter the seismic impedances to the seismic frequency. Lastly, we extract the map from the regular grid to be compared with the observed 4D seismic. This procedure is performed in the depth domain and allows a straightforward and fair comparison of the two datasets.
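A toy contrast between the two treatments of vertical resolution (averaging simulation layers directly versus resampling to a seismic grid and band-limiting) might look as follows. The moving-average kernel is only a crude stand-in for filtering to seismic frequency; a real workflow would use a proper band-pass or wavelet filter, and all numbers here are illustrative:

```python
import numpy as np

def layer_average(ip_layers):
    """Case 1: plain arithmetic average over simulation layers."""
    return ip_layers.mean()

def resample_and_filter(ip_layers, dz_sim=1.0, dz_seis=4.0, width=3):
    """Case 2 (sketch): resample impedance to coarser seismic
    sampling, then smooth to mimic the seismic band limit."""
    z_sim = np.arange(len(ip_layers)) * dz_sim
    z_seis = np.arange(0.0, z_sim[-1] + dz_seis, dz_seis)
    ip_seis = np.interp(z_seis, z_sim, ip_layers)  # clamps at the ends
    kernel = np.ones(width) / width                # crude low-pass
    return np.convolve(ip_seis, kernel, mode="same")

# a thin high-impedance layer inside a background trend
ip = np.array([6.0, 6.1, 9.5, 9.4, 6.2, 6.0, 6.1, 6.0])
print(layer_average(ip), resample_and_filter(ip).round(2))
```

The point of the contrast is that Case 1 lets the thin layer pull the map value directly, while Case 2 sees it only through the band-limited trace, which is closer to what the observed 4D seismic actually measures.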
A synthetic dataset based on a Brazilian field produced through water injection is used to validate this procedure. The dataset is composed of synthetic 4D seismic data (the observed data), generated by consistent seismic modeling and inversion, and a set of reservoir simulation models (to be matched). We computed the seismic impedance for each simulation model by applying a PEM, and two maps were generated for each model: (1) by averaging impedance values throughout the corresponding layers, and (2) by applying the proposed procedure. When these maps are subtracted from the observed data (error maps), as would happen in quantitative seismic history matching, we note relevant differences. In the dataset used, if the vertical-resolution issue is not considered (Case 1), the error map presents a strong bias that would erroneously force a decrease in water saturation to match the observed data in a seismic history matching. The map generated in Case 2, in contrast, presents better-balanced errors, related to actual differences in water movement rather than to scale and resolution issues.
The novelty of this work is a quick way to bring simulation data to seismic resolution without going through the full seismic modeling process, ensuring a proper data comparison that can be promptly added to a seismic history matching process.
Gomes, Carlos Eduardo de Aguiar Nogueira (PETROBRAS) | Maschio, Celio (University of Campinas) | Paes, Vinicius Costa Lopes (PETROBRAS) | Correia, Manuel Gomes (University of Campinas) | Câmara, Paulo Sérgio (PETROBRAS) | Santos, Antonio Alberto de Souza dos (University of Campinas) | Schiozer, Denis José (University of Campinas) | Silva, Marcia Ida Oliveira (PETROBRAS) | Dos Santos, Marcos Sebastião (PETROBRAS) | Anyzewski, Alessandra Silva
This work applies a new methodology to assimilate multi-objective data (production, injection, and pressure data from all wells) based on five of the twelve steps described by
Giant reservoirs such as Lula (Santos Oil Basin, Brazil) and Ghawar (Saudi Arabia) have high-permeability intervals, known as super-k zones, associated with thin layers. Modeling these small-scale flow features in large-scale simulation models is difficult. Current methods are limited by high computational costs or by simplifications that mismatch the representation of these features in simulation gridblocks. This work has two purposes: to present an upscaling workflow that integrates highly laminated or interbedded reservoirs with thin, highly permeable layers into reservoir simulations, through a combination of explicit modeling of super-k layers using the Parsons (1966) formula and dual-medium flow models, and to compare this method with two conventional upscaling approaches available in commercial software.
We use the benchmark model UNISIM-II-R (Correia et al. 2015a), a fine single-porosity grid built on field information from the Brazilian presalt and Ghawar oil fields, as the reference solution to compare the upscaling match between the three methods. We compare oil recovery factor (ORF), water cut (WC), average reservoir pressure (RP), water front, and simulation time consumption. Our proposed Parsons dual-medium (PDP) methodology achieved better upscaling matches with the reference model and had minimal time consumption compared with the representation of super-k layers through implicit matrix modeling in single-porosity flow models (IMP) and through the explicit representation of super-k zones in the fracture system of dual-medium flow models (DFNDP).
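The exact Parsons (1966) formula is not reproduced here, but the analytical-upscaling idea for layered media can be illustrated with the classical thickness-weighted averages, which show why a thin super-k layer dominates effective permeability for flow parallel to the layering:

```python
import numpy as np

def k_parallel(k, h):
    """Thickness-weighted arithmetic average: effective permeability
    for flow parallel to the layering (dominated by super-k layers)."""
    k, h = np.asarray(k, float), np.asarray(h, float)
    return (k * h).sum() / h.sum()

def k_series(k, h):
    """Thickness-weighted harmonic average: effective permeability
    for flow perpendicular to the layering."""
    k, h = np.asarray(k, float), np.asarray(h, float)
    return h.sum() / (h / k).sum()

# illustrative column: 10-md matrix layers and one thin 5000-md super-k layer
k = [10.0, 5000.0, 10.0]   # permeabilities, md
h = [4.5, 1.0, 4.5]        # layer thicknesses, m
print(k_parallel(k, h), k_series(k, h))
```

The thin layer raises the parallel-flow average by nearly two orders of magnitude while barely affecting the perpendicular average, which is why representing super-k layers explicitly (as in the PDP workflow above) matters so much for waterfront prediction.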
History matching (HM) is a complex process that aims to increase the reliability of reservoir simulation models. HM is an inverse problem with multiple solutions that calls for a probabilistic approach. When observed data are integrated with sampling methods, uncertainty can be reduced by updating the probability density function (
The amount of information available for field-development planning is limited, forcing the production strategy (PS) to be designed under a great amount of uncertainty. During its implementation, new information allows the adaptation of the strategy for economic gain. This work reproduces the field-development process under geological uncertainty in case study UNISIM-I-D (a benchmark case based on the Namorado Field in Brazil). The main objectives are to evaluate the process and to observe the evolution of risk curves, all in a controlled environment with real-field features.
The methodology generates new geostatistical images on the basis of new well logs, assimilates production data with an ensemble-based method, and reoptimizes the PS with a hybrid algorithm. The field development is carried out by repeatedly applying this framework with human supervision. Each step is customized with algorithms to simplify the implementation and to reduce computational effort, making this methodology more appealing for practical use. New data are collected from a high-resolution reference model that does not belong to the ensemble of models. The process starts with a PS, previously optimized under the uncertainties of the case study, which yields the real economic outcome within the original uncertainty range.
Results show high-quality history matching (HM) that excessively reduced the risk range and the variability of the updated model sets. Optimizations on the PS, on the basis of the updated ensembles, consistently increased the expected monetary value (EMV) of the project without guaranteeing an increment in the real net present value (NPV). Applying the methodology repeatedly throughout the field development increased the EMV by 29% (from USD 1.532 billion to USD 1.975 billion), whereas the real NPV decreased 2% (from USD 1.346 billion to USD 1.319 billion), falling out of the expected range and revealing that the model sets did not fully represent the real field. The lack of good representation is aggravated by heterogeneities inherent to the unknown reservoir, which are difficult to identify with only well logs and production data.
The results from the application of a closed-loop reservoir development process in a controlled environment warn against similar hidden mechanisms happening in real-field developments under similar circumstances. They reveal intrinsic pitfalls in reservoir modeling that may contribute to production-forecast problems and call for a reflection on how reservoir uncertainty assessment is performed. We show that large sets of models do not guarantee coverage of geologic uncertainties because they do not fully represent the real reservoir. The field-development process naturally changes the risk curves, contributing to revealing the lack of representation.
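The ensemble-based data assimilation used in this closed-loop workflow can be sketched, in generic form, as a single ensemble-smoother update of Kalman type; this is a textbook scheme, not necessarily the exact method of the study, and all names and numbers below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def ensemble_update(M, D, d_obs, var_obs):
    """One ensemble-smoother (Kalman-type) update: shift each model
    realization toward perturbed observations using the ensemble
    cross-covariance. M: parameters (Nm x Ne), D: simulated data
    (Nd x Ne), d_obs: observations (Nd,), var_obs: obs variances."""
    Ne = M.shape[1]
    Am = M - M.mean(axis=1, keepdims=True)           # parameter anomalies
    Ad = D - D.mean(axis=1, keepdims=True)           # data anomalies
    Cmd = Am @ Ad.T / (Ne - 1)                       # cross-covariance
    Cdd = Ad @ Ad.T / (Ne - 1) + np.diag(var_obs)    # data cov + obs noise
    E = rng.normal(0.0, np.sqrt(var_obs)[:, None], D.shape)  # perturbed obs
    K = Cmd @ np.linalg.inv(Cdd)                     # Kalman-type gain
    return M + K @ (d_obs[:, None] + E - D)

# toy check with a linear forward model d = 2m and observation d_obs = 4:
# the updated ensemble mean should move close to the true value m = 2
M = rng.normal(0.0, 1.0, (1, 500))
Mn = ensemble_update(M, 2.0 * M, np.array([4.0]), np.array([0.01]))
print(Mn.mean())
```

The over-reduction of variability reported above is a known failure mode of such updates when the ensemble does not span the true reservoir, which is exactly the representation problem the abstract describes.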
Giant reservoirs such as Lula (Santos Oil Basin, Brazil) and Ghawar (Saudi Arabia) have high permeability intervals, known as super-k zones, associated with thin layers. Modeling these small-scale flow features in large-scale simulation models is complex. Current methods are limited by high computational costs or simplifications that mismatch the representation of these features in simulation grid blocks. This work has two purposes: (1) present an upscaling workflow to integrate highly laminated or inter-bedded reservoirs with thin, highly permeable layers in reservoir simulations through a combination of (a) an explicit modeling of super-k layers using Parsons (1966) formula and (b) dualmedium flow models, and (2) compare this method with two conventional upscaling approaches, available in commercial software. We use the benchmark model UNISIM-II-R, a fine single-porosity grid based on field information from the Brazilian Pre-salt and Ghawar oil fields, as the reference solution to compare the upscaling matching between the three methods. We compare; oil recovery, water cut, average reservoir pressure, waterfront, and the time consumption for simulation. Our proposed parsons dual-medium (PDP) methodology achieved better upscaling matches with the reference model and had minimal time consumption when compared with the representation of super-k layers through an implicit matrix modelling by single porosity flow models (IMP) and through the explicit representation of super-k zones in the fracture system of dual-medium flow models (DFNDP).
In the development of oil fields, many parameters have to be selected in optimization procedures, demanding large computational effort, especially when uncertainties are considered. It is therefore common to optimize design variables (G1) and well-control variables (G2) separately in a hierarchical process, but this may yield suboptimal results. We propose a well-control analysis under uncertainties to verify whether G2 can be optimized separately in the development phase of petroleum fields. We also investigate the impact of G2 optimization on economic returns when platforms limit production and under more pessimistic production costs. We perform reactive and proactive G2 optimization under uncertainties on a benchmark model (UNISIM-I-D, based on the Namorado field) in three cases. In Case I, we use previously optimized strategies for G1, while G2 was controlled in a simplified way considering geological and economic uncertainties. In Case II, we investigate G2 optimization based on expected monetary value (EMV) for a restricted platform. Case III adopts the same restriction as Case II but considers a pessimistic economic scenario. We consider five procedures for G2 optimization: (P1) long-term bottomhole-pressure control; (P2) short-term rate control; (P3) long-term rate control; (P4) well shut-in time control; and (P5) a combination of these procedures. In Case I, the low percentage increase in EMV after G2 optimization indicates that a hierarchical process can be used in similar problems. Still, G2 should be optimized during the lifetime of the field to increase EMV without additional cost. Depending on the representative model (RM) that best characterizes the field, there are potential further gains of almost USD 200 million for one of the scenarios studied. We achieved an EMV increase of around 8% for Case II.
This suggests that G2 optimization has a greater economic return when the platform restricts production and injection for the field and G1 was not previously optimized. Case III showed an even higher EMV gain (34%), indicating the importance of considering the economic scenario when defining the G2 strategy for the field. To reduce the search space in optimization problems, conventional techniques focus on optimizing design variables, underestimating the influence of well-control management on G1. Moreover, some works do not include uncertainties and operational constraints. The results of this work indicate that G1 and G2 can be optimized hierarchically when platforms do not constrain production or injection (Case I), since the G2 variables have lower influence. This conclusion is not valid for restricted platforms and a pessimistic economic scenario (as in Cases II and III), where G2 has a higher influence on the results.
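The EMV figures quoted above are probability-weighted averages of NPV over the representative scenarios. A minimal sketch with hypothetical numbers:

```python
import numpy as np

def emv(npv, prob):
    """Expected monetary value: probability-weighted average of NPV
    over the uncertainty scenarios (numbers below are hypothetical)."""
    npv, prob = np.asarray(npv, float), np.asarray(prob, float)
    assert abs(prob.sum() - 1.0) < 1e-9  # probabilities must sum to 1
    return float(npv @ prob)

# hypothetical NPVs (USD billion) of one strategy under 3 scenarios
print(emv([1.2, 1.6, 2.1], [0.3, 0.5, 0.2]))
```

Comparing the EMV of a strategy before and after G2 optimization, scenario by scenario, is what yields the percentage gains reported for Cases I to III.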
Some difficulties are frequently associated with the integration of seismic and flow-simulation datasets, especially related to the low vertical resolution of the seismic data and the uncertain estimations in the areas between wells. One challenge is then to integrate both datasets, at different scales, in order to take advantage of their respective characteristics. The present study proposes a redistribution of the reservoir saturation estimated with 4D seismic inversion methods, combining the information provided by the flow simulation, in order to improve the quality and resolution of the estimations.
The methodology comprises a saturation-redistribution algorithm applied to each reservoir block, combining the information of two saturation maps: one derived from the 4D seismic data and the other estimated by the flow simulator. The final saturation estimation follows the vertical distribution given by the simulation data but keeps the average behavior observed in the 4D seismic. For better control of the results, the methodology is applied to a synthetic dataset that includes a reservoir model with different grid resolutions: geomodel, simulation model, and seismic model.
Two case studies present the main results of the proposed methodology for improving the saturation predictions. The first case study represents an ideal situation, in which the base model (simulation model) is assumed to be very similar to the reference model (the geomodel that represents the true answer), with only a few differences remaining due to the different scales. In the second case study, the base model is selected from among multiple realizations during the uncertainty-reduction process. The results show that it is possible to improve the resolution of the saturation-variation maps computed from 4D seismic data, allowing the identification of new fine-scale heterogeneities and providing better estimations of the saturation changes due to production. This methodology can also give additional clues in future history-matching procedures through the identification of critical regions. With the continuous calibration of the models during a history-matching process, the results obtained with the redistribution method tend to improve, approaching those obtained in the first case study.
The proposed redistribution method combines the best characteristics of the seismic and simulation data. It includes the higher sensitivity of the seismic data for identifying the areal distribution of the main anomalies and adds the higher sensitivity of the simulation data for identifying vertical water-flow trends due to gravitational effects. Thus, the procedure introduces, into the maps provided by 4D seismic, new information regarding the injection/production patterns that becomes more and more reliable as we approach the wells.
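One simple way to realize the redistribution idea, keeping the simulator's vertical saturation profile while honoring the seismic-derived column average, is an additive shift per column. This is only an illustrative scheme under that constraint; the paper's actual algorithm may differ:

```python
import numpy as np

def redistribute(sw_sim, h, sw_seis_avg):
    """Sketch: keep the simulator's vertical saturation profile but
    shift it so the thickness-weighted column average honors the
    value derived from 4D seismic (illustrative additive scheme)."""
    sw_sim, h = np.asarray(sw_sim, float), np.asarray(h, float)
    avg_sim = (sw_sim * h).sum() / h.sum()       # simulator column average
    sw = sw_sim + (sw_seis_avg - avg_sim)        # shift whole profile
    return np.clip(sw, 0.0, 1.0)                 # keep saturations physical

# one column: simulator profile, layer thicknesses, seismic-derived average
col = redistribute([0.8, 0.5, 0.3], h=[1.0, 1.0, 2.0], sw_seis_avg=0.55)
print(col.round(3))
```

The vertical ordering (gravity-driven trend) comes from the simulator, while the areal information comes from the seismic map value, matching the division of roles described above.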