A giant brownfield redevelopment project with long horizontal wells was initiated to arrest a production decline caused mainly by a lack of pressure support and free gas influx from the large gas cap.
Key value drivers for the project are developing an understanding of the layers with regard to gas breakthrough, and achieving capital efficiency through low-cost well delivery, better planning, and technology applications.
First, the field was segmented based on an analysis of the multiple factors influencing free gas production: geological aspects such as depositional environment and diagenesis; structural elements such as high-permeability streaks and fractures; and dynamic behaviors such as water injection efficiency, gas cap expansion, and coning.
Second, numerical simulations were run to rank the sectors based on expected model performance, compare them with the categorization of real data, and test the effect of newly proposed development schemes such as water injection at the gas-oil contact and long horizontal wells equipped with downhole control valves.
It was found that each sector has a specific production mechanism, and appropriate developments were recommended and then tested in the simulation. For instance, high-permeability streaks play a significant role in the development of some sectors, creating large differences in maturity between sub-layers and causing early water or gas breakthrough. The inefficiency of water injection is also one of the field's biggest issues. Most of the water injectors are located too far from the oil producers and have low injectivity because the aquifer facies are often degraded by diagenesis. The resulting lack of pressure support is counterbalanced by gas injection, leading to many high-GOR wells and poor sweep from the top of the structure, as the gas tends to bypass the oil.
Simulation work showed that several remaining zones are safe for immediate development and should be prioritized in the near future. On the other hand, some of the mature layers prone to gas and water breakthrough need enabling measures such as water injection at the gas-oil contact, artificial lift, a low-pressure system, and GOR relaxation. Tight and undeveloped reservoirs are improved by implementing long horizontal drains.
When key geological scenario uncertainties, captured in multiple conceptual models, are combined with continuous parameters, the evaluation of a representative sample set quickly becomes unmanageable, laborious, and too time-consuming to execute. A workflow is presented that enables users to easily model conceptual as well as parametric uncertainties of the reservoir without complex scripting. The chain of models for all concepts is presented in one view, providing an overview of the key differences between the concepts used. An ensemble of geologically sound samples can be created that takes into account parameter dependencies and concept probabilities. The chain of models per concept can easily be (re)executed.
A case study is presented that consists of multiple concepts based on different hierarchical stratigraphic models in combination with different fault models, each with its own fluid model (contacts defined per compartment), grid model (sub-layering and areal resolution), and rock property model. Volumetric calculations are run on an ensemble to obtain static model observables such as GRV, pore volume, and oil in place, reported for multiple sub-regions of the model in combination with a lease boundary. (When coupled with dynamic simulation, observables such as ultimate recovery and breakthrough timing could also be obtained.) As thousands of realizations were run concurrently, run time was reduced from weeks to hours. Results reveal the distribution and dependency of observables such as GRV on top-structure-depth uncertainty and contact-level uncertainty. For in-place volumes, the full suite of concepts and other parametric uncertainties, including the stochastic uncertainties (i.e., seed), is analyzed. This also enables identification of the key uncertainties that impact equity the most, which can be of great commercial value during equity negotiations. This workflow demonstrates how, with the power of cloud computing, rigorous evaluation of multiple concepts combined with many parametric uncertainties has been achieved within practical turn-around times. As such, it overcomes the prohibitive hurdles of the past that often led to simplifications to save time and effort. The result is better decision quality in resource development decisions.
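The combined sampling of discrete concepts and continuous parameters described above can be illustrated with a small Monte Carlo sketch. All concept names, probabilities, and distributions below are hypothetical, not taken from the case study:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000  # ensemble size

# Hypothetical discrete concepts (e.g., two alternative fault models) with
# assigned probabilities; each concept shifts the GRV distribution:
concepts = ["sealing_fault", "leaking_fault"]
p_concept = [0.6, 0.4]
grv_mean = {"sealing_fault": 120e6, "leaking_fault": 90e6}  # m3, illustrative

picks = rng.choice(concepts, size=n, p=p_concept)
grv = np.array([rng.normal(grv_mean[c], 10e6) for c in picks])  # gross rock volume
ntg = rng.triangular(0.5, 0.7, 0.9, size=n)    # net-to-gross
phi = rng.normal(0.22, 0.02, size=n)           # porosity
sw = rng.triangular(0.15, 0.25, 0.40, size=n)  # water saturation
bo = 1.25                                      # oil formation volume factor

# Stock-tank oil initially in place per realization (m3):
stoiip = grv * ntg * phi * (1.0 - sw) / bo
# Oilfield convention: P90 is the low case (10th percentile of volume):
p90, p50, p10 = np.percentile(stoiip, [10, 50, 90])
```

Splitting `stoiip` by `picks` then shows how much of the spread is driven by the concept choice versus the continuous parameters, which is the dependency analysis the workflow reports.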
Hydrocarbon in-place volumes are often inaccurate as a result of poor representation of the reservoir structure (by means of a 3D grid), which, in combination with traditional saturation calculation methods, leads to erroneous hydrocarbon volumes and poor investment decisions.
Traditionally, a reservoir model is represented with a 3D grid; in complex settings such as fault intersections and stacked reservoirs, a corner-point grid is often used, which has limitations in representing this complexity. The hydrocarbon saturations are then derived cell by cell on that 3D grid using simple averaging of saturation height functions. The poor structural representation of the pillar grid, combined with these simplistic averaging methods, leads to inaccurate in-place volumes, especially where a prominent transition zone is present.
This paper presents new advanced saturation averaging methods (volume- and height-weighted) using saturation height functions on 3D grids. These methods are applied to different reservoir models to compare the saturation distributions and volumetric differences against the traditional saturation calculation methods. A 4-way dip closure reservoir model with a tilted free water level (a typical example of a carbonate reservoir in the Middle East) and a faulted S-grid model of the F3-FA field (North Sea) are used.
For the 4-way dip closure reservoir model, comparing the advanced 'volume weighted' and traditional 'by center of the part of the cell' saturation averaging methods shows a significant difference in water saturations, which leads to about a 5% difference in calculated in-place hydrocarbon volumes. Further, changing the thickness and orientation of the 3D grid cells can result in even larger differences of 5-10%.
The faulted F3 model shows that the difference between the hydrocarbon saturation values is largest where it matters most, that is, around the fluid contacts and in the transition zone. The new advanced saturation averaging methods give accurate hydrocarbon saturations irrespective of the size or complexity of the 3D grid and without any discretization effects.
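A minimal sketch of the contrast between traditional cell-center evaluation and a height-weighted average of a saturation-height function. The Brooks-Corey-style function and all parameters below are illustrative, not the paper's actual models:

```python
import numpy as np

def sw_height(h, swir=0.2, h_entry=1.0, lam=2.0):
    """Illustrative Brooks-Corey-style saturation-height function:
    water saturation versus height h (m) above the free water level."""
    h = np.asarray(h, dtype=float)
    return np.where(h <= h_entry,
                    1.0,
                    swir + (1.0 - swir) * (h_entry / h) ** (1.0 / lam))

def sw_cell_center(h_base, h_top):
    """Traditional method: evaluate Sw once, at the cell-center height."""
    return float(sw_height(0.5 * (h_base + h_top)))

def sw_height_weighted(h_base, h_top, n=2000):
    """Height-weighted average: integrate Sw over the cell's height interval
    (midpoint rule), capturing the curvature of the transition zone."""
    edges = np.linspace(h_base, h_top, n + 1)
    mids = 0.5 * (edges[:-1] + edges[1:])
    return float(sw_height(mids).mean())

# A thick cell straddling the transition zone, 0.5-10.5 m above the FWL:
center = sw_cell_center(0.5, 10.5)   # single-point estimate
avg = sw_height_weighted(0.5, 10.5)  # height-weighted estimate
```

For this curved Sw(h), the two estimates differ by several saturation percentage points across the transition zone, which is exactly where the paper reports the largest volumetric impact.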
Waterflooding is the main technique used to recover hydrocarbons from reservoirs. For a given set of wells (injectors and producers), the choice of injection/production parameters such as pressures, flow rates, and the locations of these boundary conditions has a significant impact on the operating life of the wells. As a large number of combinations of these parameters is possible, one of the critical decisions is to identify an optimal set. Because using the reservoir simulator directly to evaluate these sets is unrealistic given the number of simulations required, a common approach is to use response surfaces to approximate the reservoir simulator outputs. Several techniques involving proxy models (e.g., kriging, polynomials, and artificial neural networks) have been suggested to replace the reservoir simulations. This paper focuses on the application of artificial neural networks (ANNs), as it is commonly accepted that ANNs are among the most efficient proxies owing to their universal approximation capacity, i.e., the capacity to reproduce any continuous function. This paper presents a complete workflow to optimize well parameters under waterflooding using an artificial neural network as a proxy model. The proposed methodology allows evaluation of different production configurations that maximize the NPV for a given risk. The optimized solutions can be analyzed with the efficient frontier plot and Sharpe ratios. An application of the workflow to the Brugge field is presented to optimize the waterflooding strategy.
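A minimal sketch of the proxy idea: a toy one-parameter "simulator" and a small hand-rolled one-hidden-layer network stand in for the paper's full ANN and the Brugge model. All functions, shapes, and numbers are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator_npv(x):
    """Stand-in for the reservoir simulator: NPV (arbitrary units) versus a
    normalized injection rate x in [0, 1]. Hypothetical, for illustration."""
    return 1.0 - 4.0 * (x - 0.65) ** 2

# A handful of 'simulation runs' as training data:
x_train = rng.uniform(0.0, 1.0, size=(40, 1))
y_train = simulator_npv(x_train)

# One-hidden-layer ANN proxy trained by full-batch gradient descent on MSE:
h = 8
w1 = rng.normal(0, 1, (1, h)); b1 = np.zeros(h)
w2 = rng.normal(0, 1, (h, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(20000):
    a = np.tanh(x_train @ w1 + b1)       # hidden activations
    err = (a @ w2 + b2) - y_train        # proxy NPV minus simulator NPV
    da = (err @ w2.T) * (1.0 - a ** 2)   # backprop through tanh
    w2 -= lr * a.T @ err / len(x_train)
    b2 -= lr * err.mean(axis=0)
    w1 -= lr * x_train.T @ da / len(x_train)
    b1 -= lr * da.mean(axis=0)

def proxy(x):
    a = np.tanh(np.reshape(x, (-1, 1)) @ w1 + b1)
    return (a @ w2 + b2).ravel()

# Optimize on the cheap proxy instead of the expensive simulator:
grid = np.linspace(0.0, 1.0, 201)
x_best = float(grid[np.argmax(proxy(grid))])
mse = float(np.mean((proxy(x_train) - y_train.ravel()) ** 2))
```

In the full workflow the proxy is many-dimensional and the search is run for each risk level, with the resulting NPV distributions compared via the efficient frontier and Sharpe ratios; the sketch keeps only the train-then-optimize core.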
Taha, Taha (Emerson Automation Solutions) | Ward, Paul (Emerson Automation Solutions) | Peacock, Gavin (Emerson Automation Solutions) | Heritage, John (Emerson Automation Solutions) | Bordas, Rafel (Emerson Automation Solutions) | Aslam, Usman (Emerson Automation Solutions) | Walsh, Steve (Emerson Automation Solutions) | Hammersley, Richard (Emerson Automation Solutions) | Gringarten, Emmanuel (Emerson Automation Solutions)
This paper presents a case study in 4D seismic history matching using an automated, ensemble-based workflow that tightly integrates the static and dynamic domains. Subsurface uncertainties, captured at every stage of the interpretative and modelling process, are used as inputs to a repeatable workflow. By adjusting these inputs, an ensemble of models is created, and their likelihoods are constrained by observations within an iterative loop. The result is multiple realizations of calibrated models that are consistent with the underlying geology, the observed production data, and the seismic signature of the reservoir and its fluids. The ensemble is effectively a digital twin of the reservoir with improved predictive ability, providing a realistic assessment of the uncertainty associated with production forecasts.
The example used in this study is a synthetic 3D model mimicking a real North Sea field. Data assimilation is conducted using an Ensemble Smoother with Multiple Data Assimilation (ES-MDA). This paper has a significant focus on seismic data, with the corresponding result vector generated via a petro-elastic model. 4D seismic data prove to be a key additional source of measurement data whose unique volumetric distribution helps create a coherent predictive model. This allows recovery of the underlying geological features and models the uncertainty in predicted production more accurately than was possible by matching production data alone.
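The ES-MDA update at the heart of such workflows can be sketched on a toy linear forward model. The model, noise level, ensemble size, and inflation schedule below are illustrative, not the study's:

```python
import numpy as np

rng = np.random.default_rng(1)

def forward(m):
    """Toy linear forward model standing in for the simulator plus
    petro-elastic model: predicted data from parameter vector m."""
    G = np.array([[1.0, 0.5], [0.2, 1.5], [0.8, 0.3]])
    return G @ m

m_true = np.array([2.0, -1.0])     # 'true' parameters (hypothetical)
sigma = 0.05                       # measurement-error standard deviation
d_obs = forward(m_true) + rng.normal(0, sigma, 3)

ne = 200                           # ensemble size
m = rng.normal(0.0, 1.0, (2, ne))  # prior parameter ensemble
alphas = [4.0, 4.0, 4.0, 4.0]      # inflation factors; sum(1/alpha) must be 1

for alpha in alphas:
    d = np.column_stack([forward(m[:, j]) for j in range(ne)])
    dm = m - m.mean(axis=1, keepdims=True)
    dd = d - d.mean(axis=1, keepdims=True)
    c_md = dm @ dd.T / (ne - 1)                      # param/data cross-covariance
    c_dd = dd @ dd.T / (ne - 1)                      # data auto-covariance
    c_e = sigma ** 2 * np.eye(3)                     # measurement-error covariance
    gain = c_md @ np.linalg.inv(c_dd + alpha * c_e)  # Kalman-like gain
    # Perturb observations with inflated noise, then update every member:
    d_pert = d_obs[:, None] + np.sqrt(alpha) * rng.normal(0, sigma, (3, ne))
    m = m + gain @ (d_pert - d)

m_post = m.mean(axis=1)  # posterior mean approaches m_true; spread shrinks
```

In the real workflow the data vector concatenates production, RFT, PLT, and 4D seismic observables, and the forward model is the reservoir simulator, but the assimilation algebra is the same.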
A significant advantage of this approach is the ability to simultaneously utilize multiple types of measurement data, including production, RFT, PLT, and 4D seismic. Newly acquired observations can be rapidly accommodated, which is often critical, as the value of most interventions is reduced by delay.
Data from seismic to production are integrated to build models that provide estimates of parameters such as petroleum volumetrics, pressure behavior, and production performance.
Reservoir dynamic simulation is the most widely applied process for integrating all reservoir data; in it, an Equation of State (EOS) is coupled with the simulator to estimate the fluid thermodynamic state at each computational step. The simulation consists of iterative mathematical computations in which the reservoir conditions at the previous time step are an input to determine the properties at the next and subsequent time steps. The calculated pressure is a fundamental variable at each time step, which means that a representative, high-confidence Pressure-Volume-Temperature (PVT) model is required to avoid the scale-up of errors resulting from fluid pressure estimation.
PVT modeling includes three main stages:
Fluid sample and data acquisition
Laboratory analysis and fluid characterization
The EOS model.
The emphasis in this work is on the EOS model, which is the fluid model used for the simulation process. The objective of this work is to analyze the main uncertainties associated with typical EOS modeling and to define the level of confidence of these EOS approaches. Some of the most-used approaches for EOS modeling are reviewed, and an assessment of these methods is provided based on their application to actual petroleum fluids, with the objective of defining their statistical level of confidence.
First, the study analyzes the sources of critical uncertainties in a PVT EOS model. Second, a statistically significant number of PVT laboratory studies of petroleum fluids is used to determine the level of confidence of four approaches based on the two well-known Peng-Robinson and Soave-Redlich-Kwong EOS. Third, statistical analysis is performed to determine the level of confidence of the different methods. Fourth, a correlation to determine the optimal number of pseudo-components is defined. These steps include:
Characterization of fluid and heavy components
Tuning
Lumping
As a result of this study, one can conclude:
The level of confidence of the four analyzed approaches
The significance of the difference between the analyzed methods
A correlation to determine the optimal number of pseudo-components.
In this work, a statistical analysis of some of the most-used EOS modeling approaches was performed on a set of petroleum fluid PVT studies to determine the level of confidence of four EOS modeling methods. In addition, a correlation was introduced to determine the optimal number of pseudo-components.
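As an illustration of the EOS machinery that such approaches tune, here is a minimal Peng-Robinson Z-factor calculation for a pure component, using the standard published PR constants; the methane conditions are just an example, not data from the study:

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol K)

def pr_z_factor(T, P, Tc, Pc, omega):
    """Vapor-like compressibility factor Z from the Peng-Robinson EOS for a
    pure component (largest real root of the cubic in Z)."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega ** 2
    alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc))) ** 2
    a = 0.45724 * R ** 2 * Tc ** 2 / Pc * alpha   # attraction parameter
    b = 0.07780 * R * Tc / Pc                     # co-volume
    A = a * P / (R * T) ** 2
    B = b * P / (R * T)
    # Cubic form: Z^3 - (1-B)Z^2 + (A - 3B^2 - 2B)Z - (AB - B^2 - B^3) = 0
    coeffs = [1.0, -(1.0 - B), A - 3.0 * B ** 2 - 2.0 * B,
              -(A * B - B ** 2 - B ** 3)]
    roots = np.roots(coeffs)
    return float(roots[np.abs(roots.imag) < 1e-9].real.max())

# Methane (Tc = 190.56 K, Pc = 4.599 MPa, omega = 0.011) at 300 K:
z = pr_z_factor(300.0, 5.0e6, 190.56, 4.599e6, 0.011)      # moderate pressure
z_low = pr_z_factor(300.0, 1.0e5, 190.56, 4.599e6, 0.011)  # near-ideal limit
```

For a reservoir fluid the same cubic is solved per pseudo-component mixture with mixing rules, which is where the characterization, tuning, and lumping steps analyzed in the paper enter.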
Downhole control devices are being widely implemented in fields globally, and because of the costs involved in their implementation, a robust reservoir performance forecast is necessary. A prerequisite to a sound reservoir development plan is a robust history-matched reservoir simulation model. This study uses a downhole inflow control device (ICD) well configuration in the reservoir simulation model to perform history matching of a green-field offshore Abu Dhabi. The results of this approach are compared with those from traditional approaches. The scope of this study is to examine the differences between the two history-matching approaches.
Reservoir A is one of the major reservoirs of a green-field located offshore Abu Dhabi, and is being developed with a five-spot water injection pattern. The producers and water injectors are horizontal wells, which are drilled across different flow units within the reservoir. Because the reservoir is heterogeneous across all the flow units, the injection pattern results in a non-uniform water front. The conventional approach to history matching the well performance is to implement a positive skin factor across the well completions to mimic the effect of the inflow control devices (ICDs) installed in the well: increasing the pressure drop (ΔP) between the formation and the well tubing. In this study, the actual downhole configuration was prepared using well-completion analysis software, followed by use of a next-generation reservoir simulator to run the full field reservoir model for the history matching period.
As the field is being developed on digital-oilfield principles, continuous high-frequency downhole pressure data are available in both flowing and shut-in conditions. The use of these data, coupled with direct modeling of the ICDs in the simulation model, resulted in a significant improvement in the reliability of the history match compared with traditional approaches.
This study compares two history matching approaches for fields with wells completed with downhole control devices. The core purpose of this study is to integrate the principles of the digital oil field with conventional history matching techniques, with the ultimate goal of improving the history match.
Unexpected water accumulations (called perched water) can be present inside hydrocarbon-bearing reservoirs. When geophysical data are limited or poor, predicting such accumulations can be difficult.
In this paper, a real case is used to show how the presence of perched water was initially hypothesized and then verified through production data analysis.
During the development campaign of a deepwater reservoir in West Africa, a water injector well encountered an unexpected shallower water table. To understand the nature of this water, the gas-while-drilling data of two oil producers drilled in the same area as the water injector were analyzed. This analysis indicated that the last meters of the open-hole sections of both oil producers were in water. The integration of gas-while-drilling data with the stratigraphic, sedimentological, and structural knowledge of the area suggested that this water was locally trapped during oil migration, most likely due to the presence of a structural barrier.
The two oil producer wells located in the supposed perched water area were successfully started up. The behavior of both wells was monitored daily to understand and confirm the nature of the perched water phenomenon. From day one, the two wells showed water production. After a few weeks, the water cut of one well clearly started to decline. For the other well, the water cut remained constant, and only after one year of production did a declining trend become apparent. This declining trend in water production was the final confirmation that the aquifer in this sector of the field is isolated and of limited extent. The water cut trend was also captured in the 3D dynamic reservoir model. In addition, tracers were implemented in the model to identify the different water production sources (injected or perched) and to forecast their evolution over the field life.
The literature on perched water is quite limited; this kind of phenomenon is usually detected and described only from the geological side, and the production behavior of such water is rarely observed. This case study integrates the geological and geophysical knowledge of the field with production data analysis to understand perched water behavior, and it can be considered a reference for other similar situations.
This study examines the margin of usability of Artificial Intelligence (AI) algorithms for distributing rock properties in static modeling. This novel method takes a forward modeling approach, using neural networks and genetic algorithms to optimize correlation patterns between seismic traces of stack volumes and rock properties at the wells. Once a set of nonlinear functions correlating seismic traces and rock properties is optimized at the well locations, the spatial response is estimated using the seismic volume. This seismic characterization process depends directly on error minimization during structural seismic interpretation, as well as on honoring the structural complexity while modeling; both are key to obtaining an adequate correlation between well data and seismic traces. The joint mechanism of neural networks and genetic algorithms globally optimizes the nonlinear functions and their parameters to minimize the cost function. The estimated objective function correlates well rock properties with seismic stack data. This mechanism is applied to real data from a structurally complex setting with several wells. As an output, calibrated petrophysical time volumes in the interval of interest are obtained. These properties are used initially to generate a geological facies model. Subsequently, facies and seismic properties are used for the three-dimensional distribution of petrophysical properties such as rock type, porosity, clay volume, and permeability. Therefore, artificial intelligence algorithms can be widely exploited for uncertainty reduction in the spatial estimation of rock properties.
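A minimal sketch of the genetic-algorithm part: fitting the parameters of a simple nonlinear function to synthetic attribute/property pairs. The function form, GA settings, and data are hypothetical, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 'seismic attribute' x and 'well rock property' y, related through
# a nonlinear function whose parameters the GA must recover (all hypothetical):
x = rng.uniform(-2.0, 2.0, 100)
y = 0.8 * np.tanh(1.5 * x)

def cost(params):
    """Misfit of the candidate nonlinear mapping a*tanh(b*x) at the wells."""
    a, b = params
    return float(np.mean((a * np.tanh(b * x) - y) ** 2))

# Simple elitist genetic algorithm over the two parameters:
pop = rng.uniform(-3.0, 3.0, (40, 2))
for _ in range(60):
    order = np.argsort([cost(p) for p in pop])
    parents = pop[order[:20]]                     # selection: keep best half
    pairs = rng.integers(0, 20, (20, 2))
    children = 0.5 * (parents[pairs[:, 0]] + parents[pairs[:, 1]])  # crossover
    children += rng.normal(0.0, 0.1, children.shape)                # mutation
    pop = np.vstack([parents, children])

best = pop[np.argmin([cost(p) for p in pop])]  # fitted (a, b), up to sign symmetry
```

In the paper's workflow the "parameters" are those of neural-network correlation functions and the misfit is evaluated at the well locations before applying the fitted mapping to the whole seismic volume; the sketch keeps only the global-search mechanism.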
This challenging reservoir characterization case study is defined by the interaction between two reservoirs with different production mechanisms: a fractured basement reservoir and an overlying sandstone reservoir. The existing static geologic concept has been significantly enhanced by integrating pressure data from a unique three-year shut-in period to aid modeling of fractured reservoir connectivity. Previously, the seismic dataset was predominantly used to model the fault and fracture network and guide well planning. In the current approach, the full field data set, including all drilling parameters and new reservoir surveillance data were integrated to address uncertainty in the connected hydrocarbon volume and the relative importance of each production mechanism. The result is a reservoir management tool with which to test re-development concepts and effectively manage pressure decline and increasing gas/oil ratio (GOR) and water production.
To achieve a fully integrated history-matched model, the first step was a thorough review of the existing detailed seismic interpretation, vintage production logging tool runs (PLTs), wireline logs (including borehole image (BHI) logs), and drilling data to find a causal link between hydraulically conductive fractures and well production behavior. In parallel, a material balance exercise was run to incorporate the new pressure data acquired during the field's shut-in period. The results of the material balance analysis were combined with seismic and well data to define the distribution of connected fractures across the field. Additionally, the material balance analysis was used to determine the connected hydrocarbon volume, the distribution of initial oil in place, and the relative hydrocarbon contribution of each production mechanism.
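A minimal material-balance sketch in the Havlena-Odeh spirit, for an undersaturated oil case with hypothetical numbers (the study's reservoir includes a gas cap and fracture effects, which this toy omits):

```python
import numpy as np

# Illustrative fluid/rock properties for an undersaturated oil reservoir:
boi = 1.30              # initial oil FVF, rb/stb
swi = 0.25              # initial water saturation
cw, cf = 4e-10, 5e-10   # water and pore compressibility, 1/Pa
co = 1.5e-9             # oil compressibility above bubble point, 1/Pa
p_i = 30.0e6            # initial reservoir pressure, Pa

# 'Observed' pressures and cumulative oil at three surveys (hypothetical,
# generated to be consistent with N = 100 million stb):
p = np.array([29.0e6, 27.5e6, 26.0e6])
Np_cum = np.array([2.296e5, 5.728e5, 9.145e5])  # stb

dp = p_i - p
bo = boi * (1.0 + co * dp)                      # oil FVF at each survey
F = Np_cum * bo                                 # withdrawal, reservoir bbl
Eo = bo - boi                                   # oil expansion term
Efw = boi * (cw * swi + cf) / (1.0 - swi) * dp  # connate water/rock expansion
Et = Eo + Efw

# F = N * Et, so N is the zero-intercept least-squares slope of F versus Et:
N_est = float(np.sum(F * Et) / np.sum(Et ** 2))  # estimated oil in place, stb
```

In the study, the same mass-balance logic, extended with gas-cap and aquifer terms, was applied to the shut-in pressure data to constrain the connected volume and the split between production mechanisms.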
The field is covered by multi-azimuth 3D seismic and 43 vertical to highly deviated development wells, providing significant static and dynamic data for characterizing the distribution of connected fractures. Despite this high-quality, diverse, and field-wide dataset, prior modeling iterations struggled to sufficiently describe the production behavior seen at the well level. This made it a major challenge to predict the production behavior of new development wells and to plan for reservoir management challenges. Capturing the complex interaction between production variables (including lithology, matrix versus fracture network, geomechanical stresses, reservoir damage, and pressure depletion) at field level, instead of at individual well level, resulted in a unified static and dynamic model that reconciles all scales of observation.
This oilfield represents a unique reservoir characterization opportunity. The result is a key example of how iterative, integrated geological and engineering driven reservoir modeling can be used to inform the development in a complex, mature field. This case study provides an excellent analogue for the reservoir characterization of other fractured Basement fields and/or Basement-cover reservoir couplet fields in the early to late phases of their development.