An important aspect of the reservoir management process is monitoring and revising plans, and an essential component of any reservoir management strategy is the integration of technologies (Satter et al. 1994). When revising reservoir management plans, operators rarely incorporate data beyond newly drilled wells and the field production recorded between revisions. Any inference in the interwell space is therefore an interpolation between well data: surface seismic is limited by its resolution, and calibration data are available only at the wells. For effective reservoir management, especially in the decline phase of a field, a logical integration of technologies that captures maximum heterogeneity in the interwell space can be very advantageous. Crosswell technologies provide high-resolution data between a set of wells but, if used individually, are essentially limited to a very small part of the field containing multiple wells. Therefore, to monitor and revise reservoir management plans, such technologies must be integrated with full-field solutions. This paper describes a methodology that aims to better manage the reservoir by logically integrating the 3D reservoir model (a full-field solution) with crosswell electromagnetics and crosswell seismic (interwell solutions).
Nandi Formentin, Helena (Durham University and University of Campinas) | Vernon, Ian (Durham University) | Avansi, Guilherme Daniel (University of Campinas) | Caiado, Camila (Durham University) | Maschio, Célio (University of Campinas) | Goldstein, Michael (Durham University) | Schiozer, Denis José (University of Campinas)
Reservoir simulation models incorporate physical laws and reservoir characteristics; they represent our understanding of subsurface structures based on the available information. Emulators are statistical representations of simulation models, offering fast evaluation of a sufficiently large number of reservoir scenarios to enable a full uncertainty analysis. Bayesian History Matching (BHM) aims to find the range of reservoir scenarios consistent with the historical data, in order to provide a comprehensive evaluation of reservoir performance and consistent, unbiased predictions incorporating realistic levels of uncertainty, as required for full asset management. We describe a systematic approach that combines reservoir simulation and emulation techniques within a coherent Bayesian framework for uncertainty quantification.
Our systematic procedure is an alternative, more rigorous tool for reservoir studies dealing with probabilistic uncertainty reduction. It comprises the design of sets of simulation scenarios that facilitate the construction of emulators capable of accurately mimicking the simulator with known levels of uncertainty. Emulators accelerate the steps that require large numbers of evaluations of the input space in order to be statistically valid. Via implausibility measures, we compare emulated outputs with historical data, incorporating the major process uncertainties. We then iteratively identify regions of the input parameter space unlikely to provide acceptable matches, performing more runs and constructing more accurate emulators at each wave, an approach that benefits from several efficiency improvements. We provide a workflow covering each stage of this procedure.
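The emulation-plus-implausibility step can be sketched in a few lines. Here a toy one-parameter "simulator" and a polynomial emulator stand in for the reservoir model and its statistical emulator; all functions, variances, and the 3-sigma cutoff value are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def implausibility(emu_mean, emu_var, z, obs_var, disc_var):
    """I(x): standardized distance between the emulator prediction and the
    observed history, combining emulator, observation, and model-discrepancy
    variances."""
    return np.abs(z - emu_mean) / np.sqrt(emu_var + obs_var + disc_var)

# Toy stand-ins: a one-input "simulator" and a polynomial emulator
# fitted to a small design of runs (wave 1).
simulator = lambda x: x ** 2 + 0.5 * x
train_x = np.linspace(0.0, 2.0, 8)
coeffs = np.polyfit(train_x, simulator(train_x), deg=2)
emu = lambda x: np.polyval(coeffs, x)

z = simulator(1.3)                                 # synthetic "historical" datum
emu_var, obs_var, disc_var = 0.01, 0.005, 0.005    # assumed variances

# Rule out input-space regions with I(x) > 3 (the common 3-sigma cutoff);
# what survives is the non-implausible region carried to the next wave.
cand = np.linspace(0.0, 2.0, 1001)
keep = cand[implausibility(emu(cand), emu_var, z, obs_var, disc_var) < 3.0]
```

In a real application the emulator would be a Gaussian-process or regression emulator per output, and the surviving region would be re-sampled and re-emulated wave by wave.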
The procedure was applied to reduce uncertainty in a complex reservoir case study with 25 injection and production wells. The case study contains 26 uncertain attributes representing petrophysical, rock-fluid, and fluid properties. We selected phases of evaluation considering specific events during reservoir management, improving the efficiency of simulation resource use. We also identified and addressed data patterns untracked in previous studies, such as simulator targets.
We advance the applicability of Bayesian History Matching for reservoir studies with four deliverables: (a) a general workflow for systematic BHM; (b) the use of phases to progressively evaluate the historical data; (c) the integration of two-class emulators in the BHM formulation; and (d) a demonstration of internal discrepancy as a source of error in the reservoir model.
The Barmer Hill Turbidites (BHT) are low-permeability reservoirs in the Vijaya & Vandana field, with approximately one billion barrels of oil in place. The field was discovered in 2004 with the discovery wells V-1 and V-2. After drilling and completion, these wells were tested without any stimulation, flowing only ~25–50 BOPD owing to the tight nature of these formations. The zones were subsequently hydraulically fractured and retested, yielding a roughly 10- to 12-fold increase in oil production rate. Testing of the multiple stacked reservoirs in these two wells further confirmed BHT-10 to be the most prolific zone in terms of achievable commercial flow rates. Besides being tight, these reservoirs have low net-to-gross (<20%), which added to the challenge of devising a strategy to produce them at sustained commercial oil rates. Hence, when the field moved to the next stage of the hydrocarbon field lifecycle, the appraisal campaign, two clear objectives were identified: to hydraulically fracture and test two of the existing wells while aiming to connect the maximum available kH, and to ensure effective data acquisition through injection tests and temperature logs in order to calibrate the existing stress logs and eventually build a robust frac model.
The dynamic geomechanical parameters, Young's Modulus and Poisson's Ratio, were calculated from the open-hole sonic logs and converted to static values using lab-measured data from core tests. Stress logs generated from these static data points were used for the initial frac designs in the wells. During the execution phase of the frac campaign, injection tests were carried out at every available opportunity and fall-off data were acquired to estimate the closure pressures actually observed in these zones. After acquiring the measured stress data, the earlier calculated stress logs were calibrated against the measured closure points (frac gradients) by incorporating the stress components due to the strain factors (ɛmin and ɛmax) in both the maximum and minimum principal stress directions.
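The conversion from sonic logs to dynamic moduli follows standard elastodynamic relations, and the strain factors mentioned above enter the poroelastic minimum-horizontal-stress equation. A minimal sketch in Python; all log values, the static correction factor, the Biot coefficient, and the strain magnitudes below are hypothetical, not values from this campaign.

```python
import numpy as np

def dynamic_moduli(vp, vs, rho):
    """Dynamic Poisson's ratio and Young's modulus (GPa) from sonic
    velocities (m/s) and bulk density (kg/m^3)."""
    vp2, vs2 = vp ** 2, vs ** 2
    nu = (vp2 - 2.0 * vs2) / (2.0 * (vp2 - vs2))
    e_gpa = rho * vs2 * (3.0 * vp2 - 4.0 * vs2) / (vp2 - vs2) / 1e9
    return nu, e_gpa

def sh_min(nu, e, sv, pp, biot, eps_min, eps_max):
    """Poroelastic minimum horizontal stress with tectonic strain terms:
    Sh = nu/(1-nu)*(Sv - a*Pp) + a*Pp + E/(1-nu^2)*(eps_min + nu*eps_max)."""
    return (nu / (1.0 - nu)) * (sv - biot * pp) + biot * pp \
        + e / (1.0 - nu ** 2) * (eps_min + nu * eps_max)

# Hypothetical log samples and an assumed core-derived static correction.
vp = np.array([3800.0, 4100.0])      # compressional velocity, m/s
vs = np.array([2100.0, 2300.0])      # shear velocity, m/s
rho = np.array([2450.0, 2500.0])     # bulk density, kg/m^3
nu, e_dyn = dynamic_moduli(vp, vs, rho)
e_static = 0.7 * e_dyn               # assumed lab calibration factor

# Example stress point (all in Pa); strains are the calibration knobs
# adjusted to honor measured closure pressures.
sh = sh_min(nu[0], e_static[0] * 1e9, sv=60e6, pp=30e6,
            biot=1.0, eps_min=1e-4, eps_max=5e-5)
```

In the workflow described above, ɛmin and ɛmax would be tuned until the computed stress log honors the closure pressures measured from the fall-off data.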
Temperature logs were also acquired after every injection test. These gave better control on hydraulic fracture height based on the cool-downs observed on the logs, and proved to be a very important data set for comparing the height predicted by the calibrated stress logs against the height estimated from the temperature-log cool-downs. This step helped build confidence in the model's predictability, and it also enabled real-time frac design optimization and placement of perforation intervals for the main frac designs. Further, the entire model calibration exercise helped in arriving at a porosity-based leak-off equation.
This paper discusses in detail the entire workflow used during this appraisal campaign to arrive at a calibrated and robust frac model, while showcasing the journey from 50 BOPD to 500 BOPD in these tight oil sands, a roughly tenfold production increase. The authors further emphasize the importance of carrying out such data acquisitions during the appraisal phase of a field to gain better control of the models. The paper also elaborates on the strategy deployed for these data acquisitions, both to optimize the fracs in real time and to integrate the different data sets for calibrating the geomechanical and frac simulation models.
Until recently, reservoir characterization methods in the industry were limited to the use of seismic technologies in oil and gas exploration and had a very constrained role in production and development. In the past, using characterization for field development was considered a very perilous task. Technological advancements and the industry's risk-averse mindset have significantly expanded the application of reservoir characterization. Today, reservoir characterization is the basis of any development plan made for a commercial field.
Development of 3D reservoir modeling techniques to generate field development plans (FDPs) marked a step change in reservoir characterization methods. The introduction of geostatistics and numerical simulation made it possible to build precise models that generate realistic field development scenarios. This is the state-of-the-art seismic-to-simulation method of reservoir characterization used in FDPs today. However, the struggle to estimate reservoir properties spatially away from the wells continues.
Surface seismic data provide excellent areal coverage but do not provide the vertical resolution required for a fine-scale reservoir model. Geostatistical methods reduce the uncertainty in spatial distribution of petrophysical properties from pseudo-point supports (wells) but are not calibrated spatially between the wells. Correspondingly, the fluid saturation distribution and the parameters used in dynamically calculating the same during numerical simulation are not calibrated in the interwell space.
This paper details necessary data acquisitions and methods of calibration of 3D reservoir model to reduce uncertainty in the interwell space. The data acquisition methods have been available for some time, but have rarely been electronically incorporated in the 3D reservoir model and have been largely used to analytically guide the modeling and its inferences. A logical way of interpreting the results of acquisitions and calibrating the 3D reservoir model cell-by-cell is detailed in this paper.
Immiscible water-alternating-gas (iWAG) flooding is often considered a tertiary recovery technique in waterflooded, or about-to-be waterflooded, reservoirs to increase oil recovery through better mobility control and potentially favorable hysteretic changes to phase relative permeabilities. In such cases, reservoir simulation models typically already exist and have been calibrated, often by modifying saturation functions during history matching. However, to use such models to forecast iWAG performance, additional parameters may be required. These can be acquired by simulating WAG coreflood experiments. While in many published cases the parameter values obtained from matching experimental results are used without modification, this may not be advisable, since the parameters are valid only at the core scale at which they were obtained. This paper discusses the challenge of systematically upscaling WAG parameters obtained at core scale to an existing full-field model.
In this work, we use a multi-stage upscaling process from core scale to full field scale. The first stage uses a core scale model to match ‘representative’ core flood experiments and obtain WAG parameters. The second uses a well-to-well high-resolution 1D section of the full field model populated using gridblocks of core size to generate ‘reference’ WAG performance using the unaltered WAG parameters obtained from core. The third stage uses a similar 1D model but populated using gridblocks at full field model resolution to match the results from the reference model while adjusting the WAG parameters as little as possible. Finally, a model using the full field model resolution as well as the full field relative permeability functions which, it is assumed, have been tuned to match the history and account for dispersion is used to match the reference model results and obtain final upscaled WAG parameters.
The upscaled WAG parameters obtained at the end of this multi-stage process can be used at the field scale, and the process allows clear quantification of the uncertainty associated with upscaling. Simulations at the third stage showed that once the full-field to core-scale grid size ratio exceeded a certain point (2500:1), there was a marked increase in the difference between upscaled and reference model results. When WAG parameters were changed in the full-field-resolution model to match the recovery results of the reference model, Land's parameter could change by up to 10% and the relative permeability reduction factor could increase by up to 30%, although this is expected to vary from case to case. It is therefore recommended to identify and use full-field model resolutions as close to the threshold as possible. Using core-scale iWAG parameters directly in the full-field model could underestimate actual recovery and overestimate injectivity. Considering the WAG mechanism alone, the recovery underestimate increases with pore volumes injected, in our case by up to 7% after injecting 1 pore volume of fluid.
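Land's trapping model, whose parameter the third-stage matching perturbed, relates the trapped gas saturation after imbibition to the gas saturation at flow reversal. A brief sketch shows how a 10% shift in Land's C propagates to trapped gas; the saturation and C values below are illustrative, not values from this study.

```python
def trapped_gas_saturation(sgi, land_c):
    """Land (1968) trapping model: trapped gas saturation Sgt as a
    function of the initial (turnaround) gas saturation Sgi."""
    return sgi / (1.0 + land_c * sgi)

# Hypothetical turnaround saturation and Land's parameter.
sgi, c = 0.35, 2.0
base = trapped_gas_saturation(sgi, c)
# +/-10% perturbation of C, comparable to the shift seen in upscaling.
low_c = trapped_gas_saturation(sgi, 0.9 * c)
high_c = trapped_gas_saturation(sgi, 1.1 * c)
```

A larger C traps less gas, so a 10% increase in C during upscaling translates directly into a few saturation units less trapped gas and hence a different predicted recovery.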
This multi-stage simulation approach helps identify the adjustments required and uncertainties associated with simulating iWAG flooding in reservoir models. This approach utilizes options widely present in commercially available finite difference simulators, addresses the challenge of utilizing existing pseudo functions and provides a practical methodology through which iWAG performance forecasting can be improved.
Seismic attributes play an important role in reservoir characterization and three-dimensional (3D) lithofacies modeling by providing indirect insight into the subsurface. Using seismic attributes in such studies has always been challenging because it is difficult to determine a realistic relationship between hard data points (i.e., well information) and a 3D volume of seismic attributes. However, a probability-based approach to calibrating 3D seismic attributes with well data yields better results for lithofacies modeling and the spatial distribution of reservoir properties. This paper presents a probability-based seismic-attribute calibration technique for 3D lithofacies modeling and distribution. The approach aids subsurface reservoir characterization, provides a realistic lithofacies distribution model, and reduces the uncertainty of lithofacies prediction compared with conventional methods that simply use geostatistical algorithms.
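At its core, a probability-based calibration of this kind amounts to applying Bayes' rule cell by cell: global facies proportions act as the prior, and attribute likelihoods read from well-calibrated histograms update them. A minimal sketch; the facies classes, proportions, and likelihood values are hypothetical.

```python
import numpy as np

def facies_posterior(prior, likelihood):
    """Bayes' rule per grid cell: P(facies | attribute) proportional to
    P(attribute | facies) * P(facies), normalized over the facies classes."""
    post = prior * likelihood
    return post / post.sum()

# Hypothetical two-facies example for a single cell.
prior = np.array([0.7, 0.3])        # global [shale, sand] proportions
likelihood = np.array([0.2, 0.8])   # P(attribute value at this cell | facies)
posterior = facies_posterior(prior, likelihood)
```

The resulting posterior cube can then condition a facies simulation algorithm, so the seismic attribute steers the interwell distribution instead of relying on the variogram alone.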
In-situ saturation monitoring (ISSM), using X-rays or gamma rays, has become a common method to determine fluid saturations in commercial coreflood experiments. The most common method in commercial laboratories entails 1D saturation measurements as a function of core-plug length and of experimental time. Laboratories often employ ISSM as the only method of determining fluid saturations, assuming an almost infallible accuracy of 1 to 2 saturation units (s.u.). However, as for all measurement methods, there are possible sources of uncertainty in ISSM data. Previous papers have discussed some of these uncertainties, such as X-ray drift, inappropriate calibration scans, or changes to core or fluid properties during testing. Despite this evidence, some laboratories continue to use ISSM measurements alone, assuming negligible uncertainty.
In the authors' experience, uncertainties arise not only from measurement errors but may also be introduced by inappropriate processing and interpretation methods. This paper first considers the stipulated 1 to 2 s.u. accuracy and the signal-to-noise ratio, i.e., the counts required, to achieve it, and suggests an approach, where plausible, to correct saturation data compromised by incorrect calibration scans. It also considers the uncertainties in using ISSM production volumes to determine unsteady-state relative permeability, specifically pre- and post-breakthrough data and the assumptions surrounding selection of breakthrough from flood-front scans. In addition, ISSM profiles are often used in coreflood simulation of relative permeability to aid correlation of the capillary end effect; incorrect data processing may compromise this correlation. The paper examines several sources of error in ISSM data and recommends an approach to the acquisition, processing, and interpretation of ISSM data for calculating fluid saturations.
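The counts-versus-accuracy question can be illustrated with simple Poisson counting statistics. The sketch below assumes a log-ratio saturation estimate between wet and dry calibration scans and neglects noise in the calibration scans themselves; the count totals and attenuation contrast are hypothetical, not the paper's full error analysis.

```python
import math

def saturation_std(counts, counts_wet, counts_dry):
    """Propagate Poisson counting noise into the log-ratio saturation
    estimate S = ln(N / N_dry) / ln(N_wet / N_dry).
    dS/dN = 1 / (N * span) and var(N) = N for Poisson counts;
    calibration-scan noise terms are ignored in this sketch."""
    span = math.log(counts_wet / counts_dry)
    return math.sqrt(1.0 / counts) / abs(span)

# Hypothetical scan: 1e6 counts per point, 10% attenuation contrast
# between the fully wet and dry calibration scans.
sigma = saturation_std(1.0e6, 0.9e6, 1.0e6)   # in saturation fraction
```

Under these assumptions, about a million counts per measurement point are needed just to approach 1 s.u. precision with a 10% wet/dry contrast, which shows why claimed 1 to 2 s.u. accuracy cannot be taken for granted at lower count rates or weaker contrasts.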
Lu, Jian (Shell International Exploration and Production Inc) | Toups, Dale (Shell Exploration & Production Company) | Lamoureux, Burton (Shell International Exploration and Production Inc) | Williams, Stephen (Malvern Panalytical) | Williams, Joshua (Baker Hughes, a GE company)
Characterization of water, oil, and solid field samples is essential to scale management, corrosion, and flow assurance surveillance. From sample collection to receiving lab test results takes weeks or even months for offshore locations, while operational changes can happen in hours or days. During transportation, water and solid samples are often oxidized, with iron species dropping out of solution or converting to oxides. For fast operational feedback and measurement of the "freshest" samples, on-site composition analysis is highly desirable. Typical lab analyzers, such as ICP (inductively coupled plasma) and IC (ion chromatography), are highly specialized and require regular chemical supplies and maintenance, so many lab analyzers are not suitable for on-site use.
This paper reports the development of test methods using a benchtop X-ray fluorescence (XRF) analyzer for oilfield samples, and its field application at Gulf of Mexico offshore locations. The benchtop XRF analyzer is user-friendly, requires minimal sample preparation, and leaves little room for human error. Once set up, it provides fast on-site feedback at low cost and can work with all non-gas samples. With calibrated methods, the analyzer provides quantitative measurement of elements in water or oil; for other sample types, such as solids, slurries, mixtures, and metals, it can be used for qualitative measurements for trending and component identification.
This on-site surveillance tool has proven able to provide fast and accurate data on key elements for scale, corrosion, and flow assurance management at low cost. Examples of operational decisions based on the analyzer's results are presented. The tool has demonstrated the ability to provide timely data for preventing plugging/fouling, checking chemical effectiveness, and improving integrity and well flowback surveillance. Its use during maintenance/turnaround helps build a better picture of areas with various deposits.
Inflow Control Devices (ICDs) are increasingly used in complex, heterogeneous reservoirs to make the inflow profile more uniform, delay breakthrough of water and/or gas, and limit differential depletion, which can lead to crossflow and other detrimental phenomena. However, ICDs not only alter inflow behaviour: they also affect the outflow of fluid during chemical treatments, such as scale squeezes and stimulation.
Methods to account for the additional flow resistance from ICDs when predicting the placement of bullheaded treatments are discussed in this paper, in particular to evaluate whether a theoretical approach based upon Bernoulli's theorem gives sufficiently accurate predictions in the absence of laboratory correlations between pressure drop across the ICD and flow rate. This approach may also become important where the laboratory calibration might be expected to have changed during well life, for example under the influence of erosion.
The paper describes two analytical methods of simulating placement in a multi-zone well in a heterogeneous reservoir in the Middle East: the first is empirical and models the pressure drop using an equation derived from calibration data in the laboratory; the second uses the Bernoulli equation, and is theoretical. For the empirical approach, the laboratory-based pressure-drop/flowrate calibration data were fitted to an equation, with parameters that depended upon the nozzle dimensions. The theoretical approach calculated the pressure drop using the Bernoulli equation for a cylindrical ICD nozzle. Both methods were used to simulate placement of a generic scale-inhibitor squeeze treatment and the corresponding chemical returns for each zone in the well. In general, the differences in the predictions between the two models were found to be very minor, showing that a theoretical approach is sufficiently accurate to design and evaluate chemical treatments in wells fitted with ICDs in most cases.
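For a cylindrical nozzle, the theoretical approach reduces to the Bernoulli orifice relation, with the pressure drop scaling as the square of the velocity through the nozzle throat. A minimal sketch; the flow rate, nozzle diameter, fluid density, and discharge coefficient below are illustrative assumptions, not values from the paper's case study.

```python
import math

def icd_nozzle_dp(q, d_nozzle, rho, cd=0.85):
    """Bernoulli-type pressure drop across a cylindrical ICD nozzle:
    dP = rho * v^2 / (2 * Cd^2), with v = Q / A.
    q in m^3/s, d_nozzle in m, rho in kg/m^3; dP returned in bar.
    Cd is an assumed discharge coefficient accounting for the vena
    contracta and frictional losses."""
    area = math.pi * d_nozzle ** 2 / 4.0
    v = q / area
    return rho * v ** 2 / (2.0 * cd ** 2) / 1.0e5

# Hypothetical treatment rate through a single 4 mm nozzle of an ICD.
dp_bar = icd_nozzle_dp(q=2.0e-4, d_nozzle=0.004, rho=1050.0)
```

Because the quadratic rate dependence dominates, the theoretical and laboratory-fitted curves diverge mainly through the choice of Cd, which is consistent with the paper's finding that the two placement predictions differ only slightly.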
This means a very rapid analytical approach can be used to design and evaluate near-wellbore treatments in such wells without resorting to much more complex, numerical-based reservoir simulators, even when calibration data about the ICD performance are not available.
Hydraulic fracture calibration in an unconventional environment is a complex process and is inconsistently practiced. Automated calibration methods are not effective or efficient in accounting for the heterogeneity and variation of constraining parameters. However, it is important to build a consistent methodology to calibrate hydraulic fractures incorporating the observed data. This paper covers the systematic "Seismic to Simulation" workflow for unconventional reservoirs to constrain a hydraulic fracture model to obtain a calibrated result.
For hydraulic fracture calibration, injection fall-off tests, sonic logs, and image logs are commonly used as the primary inputs to calibrate the geomechanical model. A new workflow was developed for consistent use, incorporating the learnings from traditional fracture calibration methods. The impact of high-stress barriers and of fracture height and pinchouts is incorporated in a geomechanical-flow model. The observed net-pressure trend is matched using a complex fracture model that incorporates the effect of reservoir laminations on fracture height growth. The effect of natural fracture networks (NFN) on pressure losses and proppant transport is also accounted for in the fracture geometry. Further, the hydraulic fracture geometry is calibrated using microseismic data, and the production behavior is validated through numerical simulation for production history matching.
A case study from the Permian Basin is considered in this paper. The fracture geometry and footprint obtained using the calibration workflow closely match the observed surface and downhole measurements. We constrained the model by matching net pressures and matched simulated production to actual oil and gas production within 10% error. The fracture geometry was calibrated using microseismic data and controlled by incorporating the effect of weak interfaces and laminations. This workflow successfully demonstrates hydraulic fracture model calibration using pressure matching, microseismic data, and production history matching; used systematically and consistently, it provides solutions for infill well planning and well spacing for asset optimization.
This paper explains a systematic fracture calibration procedure that operators can easily adopt to obtain reliable results in unconventional wells. The effects of reservoir laminations and natural fractures on calibrating the fracture geometry and fracture-pressure trend are uniquely demonstrated in this study.