The current scheme for developing shale reservoirs necessitates special considerations when estimating reserves. While reservoir characteristics lead to an extended infinite-acting flow regime, completion schemes can result in a series of linear flows. Therefore, the initial linear flow is not necessarily followed by boundary-dominated flow. Overlooking this observation leads to unphysical Arps’ exponents and overestimation of the Estimated Ultimate Recovery (EUR). We propose a workflow to overcome these challenges and honor the inherent uncertainty while using the classic
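The Arps decline relations the abstract refers to can be sketched as follows. This is a generic illustration with made-up rate and decline values, not parameters from the paper; the point is that for b ≥ 1 the cumulative integral does not converge, which is why an unconstrained fit returning b > 1 inflates EUR.

```python
def arps_rate(qi, di, b, t):
    """Arps hyperbolic rate decline: q(t) = qi / (1 + b*di*t)**(1/b).

    qi: initial rate, di: initial nominal decline (1/time), b: decline exponent.
    For 0 < b < 1 cumulative production is bounded as t -> infinity; for b >= 1
    it grows without bound, producing the unphysical EURs the abstract warns about.
    """
    return qi / (1.0 + b * di * t) ** (1.0 / b)


def arps_cumulative(qi, di, b, t):
    """Cumulative production for hyperbolic decline (valid for 0 < b < 1)."""
    return qi / ((1.0 - b) * di) * (1.0 - (1.0 + b * di * t) ** (1.0 - 1.0 / b))
```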
Amer, AimenAi (Schlumberger) | Sajer, Abdulazziz (Kuwait Oil Company) | Al-Adwani, Talal (Kuwait Oil Company) | Salem, Hanan (Kuwait Oil Company) | Abu-Taleb, Reyad (Kuwait Oil Company) | Abu-Guneej, Ali (Kuwait Oil Company) | Yateem, Ali (Kuwait Oil Company) | Chilumuri, Vishnu (Kuwait Oil Company) | Goyal, Palkesh (Schlumberger) | Devkar, Sambhaji (Schlumberger)
Producing unconventional reservoirs characterized by low porosities and permeabilities during the early stages of exploration and field appraisal can be challenging, especially under high-temperature, high-pressure (HPHT) downhole conditions. In such reservoirs, the natural fracture network can play a significant role in flowing hydrocarbons, which increases the importance of intersecting such a network with the borehole.
Consequently, the challenge is to plan wells through these corridors, which is not always easy. To add to the challenge, well design restrictions dictate the drilling of only vertical and, in minor cases, deviated wells. This can significantly reduce the possibility of drilling through sub-vertical fracture sets, and once seismic resolution is considered, it may seem that all odds are against encountering a fracture network.
This article addresses a case where a vertical well was drilled in the above-mentioned reservoir setting and missed the natural fracture system. The correct mitigation can make the difference between plugging and abandoning the well and putting it on production.
The technique utilized is based on a borehole acoustic reflection survey (BARS) acquired over a vertical well to give detailed insight into the fracture network 120 ft away from the borehole. Integrating this technique with core and high-resolution borehole image logs rendered an excellent match, increasing the confidence level in the acoustically predicted fracture corridors.
Based on these findings, new perforation intervals and hydraulic stimulation are proposed to optimize well performance. Such an application can reverse the well decommissioning process, opening new opportunities for the rejuvenation of older wells.
Skalinski, Mark (retired Chevron ETC) | Mallan, Robert (Chevron ETC) | Edwards, Mason (Chevron ETC) | Sun, Boqin (Chevron ETC) | Toumelin, Emmanuel (Chevron ETC) | Kelly, Grant (Chevron ETC) | Wushur, Hazaretali (Chevron ETC) | Sullivan, Michael (Chevron Canada Resources)
Abstract
Assessment of the “net pay” is an essential part of reservoir characterization and resource determination. Standard methods usually involve the use of porosity, permeability, and water saturation cutoffs to define net reservoir, net pay, and perforation zones. However, there are no industry standards for the definition of cutoffs and their application in reservoir characterization workflows. Assessment of net-pay cutoffs in carbonates is more challenging than in clastics due to the inherent heterogeneity of pore architecture and permeability. Historically, the success rate of flowing perforations is low, and operators tend to “overperforate” to capture all potential flowing zones.
This study was undertaken to redefine pay categories and provide methods of cutoff determination in carbonates, leveraging applications of NMR logging, capillary pressure, and in-situ flow measurements. The new category of “gross hydrocarbon” is introduced to describe rock charged with hydrocarbon. Two new methods for defining “gross hydrocarbon” are described: NMR shape analysis and the hydrocarbon-charged pore-throat (HCPT), or R10, method. NMR T2 shape and 2D shape analyses define the minimum porosity and/or permeability with a detectable hydrocarbon signal. The T2 shape analyses were performed for several carbonate fields around the world, yielding a porosity cutoff for hydrocarbon charge varying between 1.5 and 3.5%, depending on reservoir type.
The HCPT or R10 method used an extensive MICP dataset from these carbonate fields to predict an entry pore-throat radius corresponding to potential hydrocarbon charge. The predicted entry pore-throat log, combined with the pore-throat size corresponding to capillary pressure at a specific height above free-water level (HAFWL), made it possible to define zones that were not penetrated by hydrocarbon charge due to insufficient capillary pressure. Definition of those zones corroborated results from the NMR shape analysis. Both methods are restricted to the hydrocarbon column.
The next cutoff investigated was the minimum value of permeability associated with observed flow of in-situ fluids indicated by wireline pressure tests or production logs. This cutoff corresponds to the conventional “net reservoir” definition. The use of permeability mitigates the need for porosity cutoffs, which usually vary by rock type. The study, performed in different carbonate reservoirs, yielded permeability cutoffs varying between 0.01 and 1 mD.
Practical examples from Tengiz, Karachaganak, PZ, West Africa, and the Permian Basin validate the consistency between methods and the validity of statistical predictions of the R10 pore throat. The methods presented here can be applied to any conventional reservoir.
Li, Boxiao (Chevron Energy Technology Company) | Bhark, Eric W. (Chevron Asia Pacific E&P Company) | Gross, Stephen J. (Chevron Energy Technology Company) | Billiter, Travis C. (Chevron Energy Technology Company) | Dehghani, Kaveh (Chevron Energy Technology Company)
Assisted history matching (AHM) using Design of Experiments (DoE) is one of the most commonly applied history matching techniques in the oil and gas industry. When applied properly, this stochastic method finds a representative ensemble of history-matched reservoir models for probabilistic uncertainty analysis of the production forecast. Although DoE-based AHM is straightforward in concept, it can be misused in practice because the workflow involves many statistical and modeling principles that should be followed rigorously. In this paper, the entire DoE-based AHM workflow is demonstrated in a coherent and comprehensive case study divided into seven key stages: problem framing, sensitivity analysis, proxy building, Monte-Carlo simulation, history-match filtering, production forecast, and representative model selection. Best practices for each stage are summarized to help reservoir management (RM) engineers understand and apply this powerful workflow for reliable history matching and probabilistic production forecasting. One major difficulty in any history matching method is defining the history-match tolerance, which reflects the engineer's comfort level in calling a reservoir model "history-matched" even though the difference between simulated and observed production data is not zero. It is a concession to the intrinsic and unavoidable imperfection of reservoir model construction, data measurement, and proxy creation. A practical procedure is provided to help engineers define the history-match tolerance considering the model, data-measurement, and proxy errors.
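One concrete reading of the history-match filtering and tolerance discussion above can be sketched as below. The function names, the choice of a mean-squared misfit, and the synthetic rate data are illustrative assumptions, not the paper's implementation; `sigma` stands for the combined measurement, model, and proxy error the abstract describes.

```python
import numpy as np

def normalized_misfit(simulated, observed, sigma):
    """Mean-squared mismatch between simulated and observed rates,
    scaled by the combined error term sigma (measurement + model + proxy)."""
    return np.mean(((simulated - observed) / sigma) ** 2)

def filter_history_matched(ensemble, observed, sigma, tolerance=1.0):
    """Keep only ensemble members whose normalized misfit is within tolerance."""
    return [m for m in ensemble if normalized_misfit(m, observed, sigma) <= tolerance]

# Illustration with synthetic production data (rates in arbitrary units)
observed = np.array([100.0, 90.0, 81.0, 73.0])
sigma = 5.0                                   # assumed combined error level
ensemble = [
    np.array([101.0, 89.0, 82.0, 72.0]),      # close to history -> accepted
    np.array([130.0, 120.0, 105.0, 95.0]),    # far from history -> rejected
]
accepted = filter_history_matched(ensemble, observed, sigma, tolerance=1.0)
```

Raising the tolerance admits more models into the history-matched ensemble; the practical procedure the paper mentions is about choosing that value defensibly rather than arbitrarily.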
ABSTRACT
Assessment of net pay cutoffs in carbonates is more challenging than in clastics due to the inherent heterogeneity of pore architecture and permeability. Historically, the success rate of flowing perforations is low, and operators tend to “over-perforate” to capture all potential flowing zones. Asset teams must assign net thicknesses for modeling and resource assessment. Simple porosity cutoffs, which might be adequate for sandstones, often fail in complex carbonates. This study was undertaken to assess definitions of cutoffs in carbonates, leveraging applications of NMR logging, capillary pressure, and in-situ flow measurements.
First, we looked at the cutoffs defining hydrocarbon charge into the pore system. Proper determination of this cutoff can help better estimate hydrocarbon in place. To address this question, we developed NMR T2 Shape and 2D Shape analyses to define the minimum porosity and/or permeability with a detectable hydrocarbon signal. The T2 shape analyses were performed for several carbonate fields around the world, yielding a porosity cutoff for hydrocarbon charge varying between 1.5 and 3.5%, depending on reservoir type.
Second, extensive MICP data from these carbonate fields were used to predict an entry pore-throat radius corresponding to potential hydrocarbon charge. The predicted entry pore-throat log, combined with the pore-throat size corresponding to capillary pressure at a specific height above free-water level (HAFWL), made it possible to define zones that were not penetrated by hydrocarbon charge due to insufficient capillary pressure. Definition of those zones corroborated the results from the NMR Shape analysis, extending our ability to define “gross hydrocarbon” for fields without NMR data.
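The charge criterion described above can be illustrated with the Washburn relation r = 2σ·cosθ / Pc together with the buoyancy pressure Pc = Δρ·g·h at a given HAFWL: a zone stays uncharged when its entry pore throat is smaller than the throat that the available capillary pressure can invade. This is a minimal sketch with assumed SI-unit fluid properties (interfacial tension, density contrast, contact angle), not values from the study.

```python
import math

def capillary_pressure_pa(delta_rho, hafwl_m, g=9.81):
    """Buoyancy (capillary) pressure in Pa at a height above free-water level.

    delta_rho: water-hydrocarbon density contrast (kg/m^3), hafwl_m: HAFWL (m)."""
    return delta_rho * g * hafwl_m

def invadable_throat_radius_um(pc_pa, sigma=0.030, theta_deg=0.0):
    """Smallest pore-throat radius (microns) hydrocarbon can enter at pressure Pc,
    from the Washburn relation r = 2*sigma*cos(theta)/Pc.
    sigma (N/m) and theta (deg) are assumed, generic oil-water values."""
    r_m = 2.0 * sigma * math.cos(math.radians(theta_deg)) / pc_pa
    return r_m * 1e6

def is_charged(entry_radius_um, delta_rho, hafwl_m):
    """Zone is charged if its entry pore throat is at least as large as the
    smallest throat the available capillary pressure can invade."""
    pc = capillary_pressure_pa(delta_rho, hafwl_m)
    return entry_radius_um >= invadable_throat_radius_um(pc)
```

For example, with a 300 kg/m³ density contrast at 50 m HAFWL, only throats of roughly 0.4 µm or larger can be invaded under these assumed fluid properties, so a 0.1 µm entry throat would be flagged as uncharged.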
The next cutoff investigated was the minimum value of permeability that correlated with observed flow of in-situ fluids, i.e., production logs, derivatives of temperature logs, and wireline pressure tests. This cutoff corresponds to the conventional “net reservoir” definition. The use of permeability mitigates the need for porosity cutoffs, which usually vary by rock type. We predicted permeability from logs using the K-Nearest Neighbor method, which reconstructs the core permeability distribution well. The study, performed in different carbonate reservoirs, yielded permeability cutoffs varying between 0.01 and 0.1 mD.
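A minimal sketch of the K-Nearest-Neighbor permeability prediction mentioned above, followed by the permeability cutoff: predict log10(k) at a logged depth as the average over the k nearest core samples in feature space. The feature choice, training points, cutoff value, and the assumption that features are pre-scaled are all illustrative, not taken from the study.

```python
import numpy as np

def knn_predict_logk(train_X, train_logk, query_X, k=3):
    """Predict log10(permeability) at each query point as the mean log10(k)
    of the k nearest core samples (Euclidean distance, features assumed
    pre-scaled to comparable ranges)."""
    preds = []
    for q in query_X:
        d = np.linalg.norm(train_X - q, axis=1)
        nearest = np.argsort(d)[:k]
        preds.append(train_logk[nearest].mean())
    return np.array(preds)

# Hypothetical core data: features = [porosity (frac), a scaled log response]
train_X = np.array([[0.02, 0.9], [0.05, 1.2], [0.10, 1.5], [0.20, 2.0]])
train_logk = np.array([-2.0, -1.0, 0.0, 1.5])      # log10(k), k in mD
query = np.array([[0.06, 1.25]])

logk = knn_predict_logk(train_X, train_logk, query, k=2)
net_reservoir = 10.0 ** logk >= 0.01               # apply a 0.01 mD cutoff
```

Averaging in log10(k) rather than in k is what lets the method reconstruct the core permeability distribution, which typically spans several orders of magnitude.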
This approach allowed us to define a set of recommendations for the definition of net reservoir and net pay, and to provide a practical methodology to assess hydrocarbon potential. The methods presented here can be applied to any conventional reservoir.
Tengizchevroil (TCO) is the largest operator in Kazakhstan, developing two of the world's deepest supergiant oilfields - Tengiz and its satellite field, Korolev. With over 20 years of oil production at TCO, reservoir pressure has been declining and is approaching the bubble-point pressure. In order to arrest the declining pressure trend and extend the oil production plateau, TCO is evaluating Improved Oil Recovery (IOR) opportunities, including a potential Waterflood in the Korolev field.
Accurate Waterflood evaluation requires improved characterization of the main uncertainties impacting ultimate recovery under IOR processes. Therefore, we built the next-generation Korolev reservoir model (SIM15K), which incorporates the results of the latest characterization efforts based on the latest wide-azimuth 3D seismic survey. This work led to an updated Korolev depositional model, which helps to understand the links between geological settings and fracture occurrence. In conjunction with the first implementation of the Dynamic Data Integration workflow, this resulted in updated Low-Mid-High fracture models - one of the main factors controlling Waterflood performance in naturally fractured reservoirs.
This paper focuses on the Brownfield Experimental Design (ED) of the Korolev field, which is specifically designed to provide an estimate of IOR Incremental Recovery. We identified 23 main uncertainty parameters for each of the Low-Mid-High fracture models. The Brownfield ED was run with two development scenarios, Primary Depletion and Waterflood, to obtain a probabilistic assessment of Incremental Waterflood Recovery. Overall, 803 cases were required for each fracture model and development scenario to generate good-quality proxies for cumulative recoveries and History-Match error. Those proxies were used to sample the entire uncertainty space and define P10/50/90 targets.
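The last step above - sampling the proxies across the full uncertainty space to set P10/50/90 targets - can be sketched as follows. The polynomial proxy, its coefficients, and the uniform parameter ranges are invented for illustration; note the petroleum-industry exceedance convention, where P10 denotes the high case and P90 the low case.

```python
import numpy as np

rng = np.random.default_rng(42)

def recovery_proxy(x1, x2):
    """Hypothetical polynomial proxy for cumulative recovery (arbitrary units),
    standing in for a proxy trained on the experimental-design simulation runs."""
    return 50.0 + 8.0 * x1 - 3.0 * x2 + 2.0 * x1 * x2

# Monte-Carlo sample the uncertainty space (parameters scaled to [-1, 1]);
# a proxy makes 100,000 evaluations cheap, unlike full-field simulation.
n = 100_000
x1 = rng.uniform(-1.0, 1.0, n)
x2 = rng.uniform(-1.0, 1.0, n)
samples = recovery_proxy(x1, x2)

# Exceedance convention: P10 is the high case, P90 the low case
p90, p50, p10 = np.percentile(samples, [10.0, 50.0, 90.0])
```

The P10/50/90 models are then chosen as the simulation cases whose proxy-predicted recoveries fall closest to these targets.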
As a result of the robust Brownfield ED, we selected P10/50/90 models to capture the range in both Incremental Waterflood Recovery and Ultimate Recovery under Primary Depletion. The underlying uncertainty parameters for the final model selection were picked based on their relative impact on the objective functions. Currently, the new SIM15K model is being used for Korolev Waterflood evaluation and optimization, Reserves estimation, existing infrastructure optimization, and future project design.
Almubarak, Tariq (Texas A&M University) | Ng, Jun Hong (Texas A&M University) | Nasr-El-Din, Hisham (Texas A&M University)
This literature review summarizes the theory and application of chelating agents in acidizing both carbonate and sandstone formations, and in hydraulic fracturing. The objective of this work is to explain the key role that chelating agents play in stimulation.
This paper reviews results that were obtained through various laboratory tests, which aid in understanding chelating-agent interactions with formation rock and fluids. Results were obtained through the following tests: coreflooding, corrosion, compatibility, Inductively Coupled Plasma (ICP), Environmental Scanning Electron Microscopy (ESEM), and X-Ray Diffraction (XRD).
At high temperatures, conventional acids such as HCl show severe corrosion, lack of penetration and sludging characteristics. Several organic acids were proposed in the reviewed literature to solve these issues. However, even organic acids result in solubility and incompatibility issues. Based on these shortcomings, chelating agents are often used and show good dissolving power, low corrosion, low sludging tendencies, excellent iron control, and some are highly degradable and environmentally friendly. In addition to acidizing, chelating agents allow for the application of saline water fracturing because of their water-softening properties. They also contribute to high-temperature water fracturing through delayed chelation of both the crosslinker and breaker.
This work summarizes the applications of chelating agents in the stimulation sector. Over 100 papers were reviewed, including the latest developments and field applications of this technology. Readers can easily expand on this paper to further explore the wide range of applications chelating agents can offer the oil and gas industry.
The main objective of this study was to build a 3D geological model that allows geologists to better understand facies distribution based on a deep comprehension of depositional-model concepts. The goal is to allow the geologist to walk into and traverse such a model in real time. This may seem out of the norm, but by the end of this paper you will be able to appreciate this new technique.
The case study presented in this article focuses on the Ferron Sandstone Member exposed at Muddy Creek Canyon in Utah, USA. Seven vertical wells have been drilled, cored, and logged behind the valley walls. Besides basic openhole logs, two wells have been logged with borehole image logs. Detailed outcrop traverses have been performed along the Muddy Creek Canyon area, and surface topography along with subsurface facies variations have been captured and integrated with observations of modern-day fluvial systems from the Trinity River, Texas, USA. The latter has served as a modern-day analog for parts of our model.
The modeled facies range from deep-marine shales, shoreface deposits, and delta deposits all the way to fluvial, swamp, and lagoon facies. Sea-level fluctuations have been captured and modeled in a sequence-stratigraphic fashion.
The model produced does not just integrate surface and subsurface data into a single 3D geological model; it also incorporates the concepts of sequence stratigraphy and sea-level fluctuations. This approach allowed comparison between the 3D model, the actual outcrops, and modern depositional processes. A number of iterations had to be performed on the model to reach an accurate representation of the outcrops and subsurface data. This was done by taking the model to the outcrop in order to quality-check the actual distribution of facies, hence "walking in the model".
The value of such an exercise is to develop a deeper understanding of reservoir distributions, lateral changes in facies, and depositional environment variations in the subsurface, and to apply these learned concepts to oil fields. By doing so, the geologist has a better chance of reducing the uncertainties associated with reservoir, seal, and source rock distribution, increasing the chance of finding hydrocarbons.
Goodwin, N. (Essence Products and Services Ltd.) | Esler, K. (Stone Ridge Technology) | Ghasemi, M. (Stone Ridge Technology) | Mukundakrishnan, K. (Stone Ridge Technology) | Wang, H. (iReservoir.com, Inc.) | Gilman, J. R. (iReservoir.com, Inc.) | Lee, B. (Memorial Resource Development Corp.)
History matching large, complex fields has remained time-consuming, and valid probabilistic uncertainty quantification has been a distant goal. Significant advances have been made recently in the development of improved sampling algorithms, efficient and accurate proxy models, and extremely high-performance forward simulation using GPUs and scalable algorithms. Combining these, we present a highly efficient workflow for robust probabilistic forecasts of a real field model with detailed reservoir characterization and over thirty years of production history.
The model used here describes a highly faulted offshore field produced from two platforms with 11 distinct sand sub-intervals and 22 major faults. Geostatistical methods were used to estimate each reservoir's static property distribution based on well logs. This static 3D model was upscaled and combined with reservoir engineering data and historical interpretations to provide the initial fluid distribution for use in flow-simulation history matching and forecasts. Hamiltonian Markov Chain Monte Carlo techniques coupled with an efficient implementation of proxy models are used for uncertainty quantification. A fast, highly efficient, and scalable GPU-based reservoir simulator is employed for all forward simulations.
The complexity of the geological properties of the reservoir requires a large number of cells to adequately describe the dynamics of production, and many adjustable parameters are needed to characterize the uncertainties in the model. While the use of advanced sampling and proxies vastly reduces the number of detailed flow simulations needed, hundreds of forward simulations are still required to calibrate the proxies and production forecasts. The bandwidth and computational throughput provided by GPUs allow such simulations to be performed extremely fast using only a modest amount of hardware. Through this case study, we show that by coupling an advanced history matching, prediction and optimization tool with a fast GPU reservoir simulator, accurate probabilistic uncertainty quantification can be made practical even for large models.
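To make the sampling side of this workflow concrete, here is a deliberately simplified random-walk Metropolis sketch over a proxy-based likelihood - a stand-in for the Hamiltonian Markov Chain Monte Carlo the authors actually use, which adds gradient-guided proposals for better mixing. The linear proxy, observed data, and error level are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_posterior(theta, proxy, observed, sigma):
    """Gaussian likelihood on the proxy-predicted production mismatch,
    with a flat prior on the scaled parameter box [-1, 1]."""
    if np.any(np.abs(theta) > 1.0):
        return -np.inf
    resid = (proxy(theta) - observed) / sigma
    return -0.5 * np.sum(resid ** 2)

def metropolis(proxy, observed, sigma, n_steps=5000, step=0.1, dim=2):
    """Random-walk Metropolis sampler (a simplified stand-in for the
    Hamiltonian MCMC referenced in the paper)."""
    theta = np.zeros(dim)
    lp = log_posterior(theta, proxy, observed, sigma)
    chain = []
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal(dim)
        lp_prop = log_posterior(prop, proxy, observed, sigma)
        if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain)

# Hypothetical linear proxy: two observed rates driven by two parameters
proxy = lambda t: np.array([100.0 + 20.0 * t[0], 80.0 + 15.0 * t[1]])
observed = np.array([110.0, 72.0])
chain = metropolis(proxy, observed, sigma=2.0, n_steps=5000)
posterior_mean = chain[2000:].mean(axis=0)         # discard burn-in
```

Because every likelihood evaluation here hits a cheap proxy rather than a full simulation, thousands of MCMC steps are affordable; the expensive GPU simulations are reserved for calibrating the proxy and verifying the final ensemble.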
The workflow presented here can potentially transform the way probabilistic uncertainty quantifications are performed on large real field models. It will lay the foundation for the next generation of fast and accurate uncertainty tools and workflows.
Kumar, Sarwesh (Chevron Corporation) | Wen, Xian-Huan (Chevron Corporation) | He, Jincong (Chevron Corporation) | Lin, Wenjuan (Chevron Corporation) | Yardumian, Hrant (Chevron Corporation) | Fahruri, Irvan (Chevron Corporation) | Zhang, Yanfen (Chevron Corporation) | Orribo, Jose M. (Chevron Corporation) | Ghomian, Yousef (Chevron Corporation) | Marchiano, Iryna Petrovska (Chevron Corporation) | Babafemi, Ayanbule (Chevron Corporation)
Reservoir simulation is a widely accepted tool for assessing the impact of uncertainties on upstream investment decisions. Currently, the most widely used workflow addressing these uncertainties is a traditional two-step approach: 1) geoscientists perform static uncertainty analysis with earth-modeling parameters and select a few representative geological models (for example, low-mid-high); 2) reservoir simulation engineers conduct dynamic uncertainty analysis with dynamic parameters combined with the pre-selected geological models and perform history matching, forecasting, or optimization. In this workflow, all the geological uncertainties are lumped into one parameter (the grid) for use in the second step. This severely reduces the flexibility to consider a wider range of alternative static realizations, and thus may bias the history match and forecasts. We implemented an integrated workflow, called "big-loop", that unifies the two-step approach into a single step. This allows for simultaneous and explicit analysis of both types of uncertainties and improves reservoir management decision quality. It also allows for direct modification of earth-model parameters to achieve a history match with geological consistency. Although the concept is not new to the industry, it is rare to find references to field applications of the "big-loop" workflow. We present applications of this workflow to both greenfield and brownfield reservoirs to demonstrate its value in improving accuracy and efficiency. In a Gulf of Mexico green field, the workflow is applied for uncertainty analysis of static parameters (for example, sand channel width and salt body extension) and dynamic parameters (for example, rock-fluid properties) for probabilistic Original Oil in Place (OOIP) assessment and production forecast. The workflow facilitates the design of uncertainty resolution, upside capture, and downside mitigation plans.
In an onshore fractured reservoir, the workflow is applied for simultaneous history matching using static fracture parameters (fracture length and aperture) and dynamic parameters. The workflow improves the model accuracy and decision quality for the upcoming IOR/EOR development project. In an offshore gas field, the workflow is used to perform experimental design (ED) studies with static and dynamic uncertainties. This systematic and automatic workflow eliminates manual inputs and reduces the need for recycles. Finally, in another field, the workflow is used to perform probabilistic history matching using static and dynamic parameters. This workflow is capable of delivering a full-cycle solution for uncertainty assessment and probabilistic history matching with high efficiency and high-quality results.