The Merriam-Webster Dictionary defines simulate as "assuming the appearance of without the reality." Simulation of petroleum reservoir performance refers to the construction and operation of a model whose behavior assumes the appearance of actual reservoir behavior. The model itself is either physical (for example, a laboratory sandpack) or mathematical. A mathematical model is a set of equations that, subject to certain assumptions, describes the physical processes active in the reservoir. Although the model itself obviously lacks the reality of the reservoir, the behavior of a valid model simulates--assumes the appearance of--the actual reservoir. The purpose of simulation is estimation of field performance (e.g., oil recovery) under one or more producing schemes. Whereas the field can be produced only once, at considerable expense, a model can be produced or run many times at low expense over a short period of time. Observation of model results that represent different producing ...
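The idea of running a cheap model many times under different producing schemes can be sketched with a deliberately simple example. This is not a reservoir simulator, just an exponential-decline tank model with made-up numbers; all values (OOIP, rates, decline factors) are illustrative assumptions.

```python
# Minimal sketch: a "model" as a set of equations that can be run many
# times under different producing schemes. All numbers are illustrative.

def run_tank_model(ooip_stb, initial_rate_stbd, decline_per_year, years):
    """Exponential-decline tank model: returns cumulative recovery in STB."""
    cum = 0.0
    rate = initial_rate_stbd
    for _ in range(years):
        cum += rate * 365.0                 # produce one year at current rate
        rate *= (1.0 - decline_per_year)    # apply annual decline
    return min(cum, ooip_stb)               # cannot recover more than OOIP

# Two hypothetical producing schemes "run" on the same model, at no field cost.
scheme_a = run_tank_model(ooip_stb=50e6, initial_rate_stbd=5000,
                          decline_per_year=0.10, years=20)
scheme_b = run_tank_model(ooip_stb=50e6, initial_rate_stbd=8000,
                          decline_per_year=0.18, years=20)
print(f"Scheme A recovery: {scheme_a / 1e6:.1f} MMSTB")
print(f"Scheme B recovery: {scheme_b / 1e6:.1f} MMSTB")
```

Comparing the two runs is exactly the kind of observation a real study makes with a full simulator, only at far higher fidelity.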
The Haft Kel field is located in Iran. Its Asmari reservoir structure is a strongly folded anticline 20 miles long by 1.5 to 3 miles wide, with an oil-column thickness of approximately 2,000 ft. The most probable original oil in place (OOIP) was slightly less than 7 × 10⁹ stock-tank barrels (STB), with about 200 million STB in the fissures; numerical-model history matching resulted in a value of 6.9 × 10⁹ STB. The matrix block size determined from cores and flowmeter surveys varied from 8 to 14 ft. The numerical simulation model considered matrix permeabilities from 0.05 to 0.8 md.
This field produces from a structure that lies above a deep-seated salt dome (salt has been penetrated at 9,000 ft) and has moderate fault density. A large north/south-trending fault divides the field into east and west areas, with hydraulic communication across the fault. Sands were deposited in aeolian, fluvial, and deltaic environments, primarily a meandering distributary flood plain. Reservoir sands are moderately to well sorted; grains are fine to very fine with some interbedded shales. There are 21 mapped producing zones separated by shales within the field but in pressure communication outside the productive limits of the field. The original oil column was 400 ft thick and had an associated gas cap one-third the size of the original oil column. Porosity averages 30%, and permeability varies from 10 to 1,500 md.
The term "petrophysics" was coined by G.E. Archie and J.H.M.A. Thomeer in a quiet bistro in The Hague. By their definition, petrophysics is the study of the physical and chemical properties of rocks and their contained fluids, emphasizing those properties relating to the pore system and its fluid distribution and flow characteristics. These properties and their relationships are used to identify and evaluate hydrocarbon reservoirs, hydrocarbon sources, seals, and aquifers.

The petrophysicist provides the answer products needed by team members, as well as the physical and chemical insights behind them. The reservoir and fluid characteristics to be determined are thickness (bed boundaries), lithology (rock type), porosity, fluid saturations and pressures, fluid identification and characterization, permeability (absolute), and fractional flow (oil, gas, water). It is easy to define these characteristics and to appreciate their part in the assessment of reserves. The difficult part is determining their actual values at the level of certainty needed to make economic decisions leading to development and production. The seven characteristics listed are interdependent (e.g., to properly determine porosity from a wireline log, one must know the lithology, fluid saturations, and fluid types).

The science of petrophysics is used to unscramble the hidden world of rock and fluid properties in reservoirs from just below the Earth's surface to more than four miles deep. The petrophysicist takes on many of the characteristics of the fictional sleuth Sherlock Holmes, extrapolating from the most meager of clues the true picture of the subsurface reservoir, using dogged determination to wrest all possible information from the available data, all the while enjoying the thrill of the hunt. How does the petrophysicist solve this difficult problem?
Archie's general method is to subdivide the problem into smaller segments and iterate using all data until all data agree. One starting point is to determine rock types (petrofacies) wherein we identify pore type, pore size distribution, pore throat type, and pore throat distribution.
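One concrete building block in that iteration is Archie's own water-saturation relation, which also illustrates the interdependence noted above: computing Sw requires porosity, formation-water resistivity, and true resistivity as inputs. The sketch below uses the standard Archie form with common clean-sandstone default constants (a = 1, m = 2, n = 2); the input values are assumed for illustration.

```python
# Sketch of the classic Archie water-saturation relation:
#   Sw = ((a * Rw) / (phi^m * Rt)) ^ (1/n)
# Constants a = 1, m = 2, n = 2 are common clean-sandstone defaults (assumed).

def archie_sw(porosity, rw_ohmm, rt_ohmm, a=1.0, m=2.0, n=2.0):
    """Water saturation (fraction) from porosity, Rw, and Rt."""
    return ((a * rw_ohmm) / (porosity ** m * rt_ohmm)) ** (1.0 / n)

# Illustrative inputs: 25% porosity, Rw = 0.05 ohm-m, Rt = 10 ohm-m.
sw = archie_sw(porosity=0.25, rw_ohmm=0.05, rt_ohmm=10.0)
print(f"Sw = {sw:.2f}")  # -> Sw = 0.28
```

Note how an error in any one input (lithology-dependent m, mis-picked porosity) propagates directly into Sw, which is why the iterate-until-all-data-agree discipline matters.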
This chapter discusses the determination of lithology, net pay, porosity, water saturation, and permeability from wellbore core and log data. The chapter deals with "Development Petrophysics" and emphasizes the integration of core data with log data; the adjustment of core data, when required, to reservoir conditions; and the calibration and regression line-fitting of log data to core data. The goal of the calculations is to use all available data, calibrated to the best standard, to arrive at the most accurate quantitative values of the petrophysical parameters (i.e., lithology, net pay, porosity, water saturation, and permeability). Log analysis, cased-hole formation evaluation, and production logging are not covered here. The following topics are covered in this chapter: petrophysical data sources and databases, lithology determination, net-pay (or pay/nonpay) determination, porosity determination, fluid-contacts identification, water-saturation determination, permeability ...
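The calibration and regression line-fitting of log data to core data mentioned above can be sketched with a minimal least-squares fit. The depth-matched porosity pairs below are invented for illustration; a real workflow would also adjust core values to reservoir conditions before fitting.

```python
# Illustrative sketch: calibrating log-derived porosity to core porosity
# with an ordinary least-squares line fit. Data points are made up.

core_phi = [0.08, 0.12, 0.18, 0.22, 0.27]   # core-plug porosity (fraction)
log_phi  = [0.10, 0.13, 0.20, 0.25, 0.30]   # log porosity at the same depths

n = len(core_phi)
mean_x = sum(log_phi) / n
mean_y = sum(core_phi) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(log_phi, core_phi)) \
        / sum((x - mean_x) ** 2 for x in log_phi)
intercept = mean_y - slope * mean_x

def calibrated_porosity(phi_log):
    """Apply the core-calibrated correction to a log porosity reading."""
    return slope * phi_log + intercept

print(f"phi_core = {slope:.3f} * phi_log + {intercept:+.3f}")
```

The fitted line is then applied to the continuous log so that every level, not just the cored interval, honors the core standard.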
Abstract The filtrate contamination cleanup time on a complex carbonate well using a traditional wireline formation tester (WFT) tool can vary from a couple of hours to over half a day. The proposed method aims to reduce the operational time needed to collect a low-contamination formation-fluid sample by identifying regions with a smaller depth of invasion, using a forward-model simulation that considers static and dynamic formation properties to predict the radial profile of invasion. The mud-filtrate invasion process was modeled, considering the static and dynamic properties of the near-wellbore region, in an industry-reference reservoir simulator that integrates three mechanisms for fluid flow: Darcy's law, material balance, and capillary pressure. The physical robustness of the reservoir simulator was combined with a data-driven model to reduce the computational cost. This proxy model is based on a neural network trained on a broad range of scenarios to predict the numerical simulation results with high accuracy. The invasion estimate from the model is then used to predict the filtrate cleanup time using an established industry numerical model. The depth of mud-filtrate invasion is one of the variables that most influences cleanup time; reducing that time is therefore a determinant of WFT operational efficiency. The model for mud invasion has been successfully tested on a complex carbonate well, and the results for the depth of mud invasion were comparable to those obtained with a commercial data-driven inversion using multiple resistivity channels. The estimated cleanup time based on the depth of invasion predicted by the forward model was compared and matched with real carbonate sampling stations, showing a high correlation: zones with a lower depth of invasion required less cleanup time.
In addition, using the history-matched cases, different WFT technologies such as single and radial probes and focused, unfocused, and dual-packer WFT inlets were evaluated, showing high potential for reducing operational time when properly planned and selected for the specific type of reservoir. The proposed methodology is a viable way to understand cleanup behavior in different reservoir scenarios using different WFT technologies. The innovation of this method lies in calibrating the data using basic and advanced petrophysical properties through a data-driven model based on a trained neural network, reducing the uncertainty in the predicted invasion radial profile and the WFT cleanup time. The reliability of the theoretical results was increased through calibration against real data, and this calibrated theoretical model has been used to guide sampling-depth selection, saving operational time.
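The material-balance component of the invasion model can be illustrated with a back-of-envelope estimate of invasion radius from lost filtrate volume, assuming piston-like displacement. This is only a sketch of one of the three coupled mechanisms (the full forward model also honors Darcy flow and capillary pressure); every input value below is an assumption for illustration.

```python
# Back-of-envelope material-balance estimate of mud-filtrate invasion radius,
# assuming piston-like displacement. Inputs are illustrative, not field data.
import math

def invasion_radius_ft(filtrate_bbl_per_ft, porosity, rw_ft, sat_change=0.5):
    """Radius (ft) reached by filtrate per foot of hole, assuming the
    flushed zone's saturation changes by `sat_change` (piston model)."""
    vol_cuft = filtrate_bbl_per_ft * 5.615            # bbl -> ft^3
    area = vol_cuft / (porosity * sat_change)          # invaded annulus area, ft^2
    return math.sqrt(rw_ft ** 2 + area / math.pi)      # solve pi*(r^2 - rw^2) = area

# Hypothetical case: 0.2 bbl filtrate lost per foot, 20% porosity, 0.35-ft hole radius.
r = invasion_radius_ft(filtrate_bbl_per_ft=0.2, porosity=0.20, rw_ft=0.35)
print(f"Invasion radius ~ {r:.2f} ft")
```

Even this crude estimate shows why low-loss intervals make attractive sampling stations: invasion radius grows with filtrate volume and shrinks with porosity.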
Abstract With this paper, we demonstrate how CoreDNA, a trans-disciplinary suite of high-resolution, non-destructive measurements performed on whole cores at the onset of core analysis programs, gives operations geologists and petrophysicists an innovative, cost-effective, and objective way to characterize the reservoir quality of highly laminated hydrocarbon-bearing formations where the standard practice (systematic plugging every foot) fails to provide a correct estimate. The case study focuses on core data from three wells intersecting formations characterized by very thin (millimetre-scale) sand and clay/silt laminations, where the resolution of conventional wireline and lab gamma ray logs was not sufficiently sharp for an effective evaluation of reservoir quality. Although a high volume of routine core analysis data was already available for these wells, the remaining uncertainty on reservoir evaluation was deemed high enough by the study team to motivate the acquisition of additional data comprising ultra-high-resolution pictures (1.8μm/px) and topographic maps created from micron-accurate laser scans. We explain how continuous profiles of grain-size indicators could be used to predict permeability variations across these laminated formations and to define a permeability cut-off for the identification of poor vs. good reservoir ratios compatible with the reservoir characteristics. CoreDNA test procedures are specifically designed to greatly accelerate the deliverables of core analysis, so that petrophysical evaluation may start right from the moment cores arrive from the well site, which is usually months before routine core analysis results are known. In the context of this paper, CoreDNA results were confirmed a posteriori by the permeability measured on plug samples from the first two wells.
In the third well, however, some marked differences were observed: although permeability ranges were found to be similar by the two methods, the distribution of permeability values obtained from routine core analysis conducted according to standard guidelines (one sample per foot) gave a more optimistic picture of permeability (90% of rock above the 1-mD cut-off) than the alternative approach based on high-resolution continuous grain-size data (70% of rock above the 1-mD cut-off). From the above findings, we conclude that a standard 1-ft interval for plug acquisition is not enough to fully characterise the distribution of permeability in highly laminated formations. Alternatively, a continuous profile of permeability index based on high-resolution grain-size measurements offers a fast and cost-efficient solution to obtain representative reservoir-quality data, enabling objective well and reservoir management decisions a few days after barrel opening without compromising core integrity for further studies.
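The sampling-bias effect described above can be reproduced with a toy calculation: compare the fraction of rock above a 1-mD cut-off computed from a dense (continuous-proxy) permeability profile versus sparse 1-ft plug samples. The synthetic laminated profile and the plug-placement offset are assumptions chosen only to show the direction of the bias, not to reproduce the paper's 90%-vs-70% figures.

```python
# Toy comparison: net fraction above a 1-mD cut-off from a dense profile
# vs. sparse 1-ft plugs. The synthetic laminated profile is illustrative.

def frac_above_cutoff(perms, cutoff_md=1.0):
    """Fraction of samples at or above the permeability cut-off."""
    return sum(1 for k in perms if k >= cutoff_md) / len(perms)

# Synthetic laminated interval: alternating good sand (~50 mD) and tight
# silt/clay (~0.1 mD), sampled every 0.1 ft over 10 ft (100 samples).
continuous = [50.0 if (i % 3) else 0.1 for i in range(100)]

# Plugs every 1 ft; the offset models crews preferentially plugging
# visually better rock, which biases the sparse estimate upward.
plugs = continuous[1::10]

print(f"continuous profile: {frac_above_cutoff(continuous):.0%} above 1 mD")
print(f"1-ft plug samples:  {frac_above_cutoff(plugs):.0%} above 1 mD")
```

With only ten plugs, the estimate both carries large sampling error and inherits any placement bias, which is the core argument for the continuous approach.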
Xu, Guoqing (Sinopec Research Institute of Petroleum Engineering; corresponding author) | Han, Yujiao (email: email@example.com) | Jiang, Yun (Sinopec Research Institute of Petroleum Engineering) | Shi, Yang (Research Institute of Petroleum Exploration & Development, PetroChina; corresponding author) | Wang, Mingxian (email: firstname.lastname@example.org) | Zeng, XingHang (Research Institute of Petroleum Exploration & Development, PetroChina; corresponding author)
Summary Spontaneous imbibition (SI) is regarded as an effective method to improve oil recovery in tight sandstone reservoirs, and it motivates significant changes in fracturing design and flowback treatment. However, a long shut-in period would aggravate the retention of fracturing fluid, which conflicts with the goal of high production in the field. It is therefore imperative to understand how SI works during shut-in time, so that the effect of imbibition on oil recovery enhancement can be maximized. In this study, a series of experiments was conducted to simulate residual-oil-saturation conditions so that the underlying mechanism by which imbibition improves oil recovery could be investigated. Low-field nuclear magnetic resonance (LF-NMR) was used to provide direct observation of phase changes in different pore sizes. The experimental results show a positive effect of imbibition on residual-oil reduction. This phenomenon further explains the observations made during well shut-in and soaking periods as well as the low flowback efficiency. This study aims to clarify the mechanism of SI behavior and help improve the accuracy of production prediction.
Summary A physics-based data-driven model is proposed for forecasting subsurface energy production. The model relies entirely on production data and does not require any in-depth knowledge of reservoir geology or governing physics. In the proposed approach, we use the Delft Advanced Reservoir Terra Simulator (DARTS) as a workhorse for data-driven simulation. DARTS uses an operator-based linearization technique that exploits an abstract interpretation of the physics, benefiting computational performance. The physics-based data-driven model is trained to fit data, increasing the fidelity of the model forecast and reflecting significant changes in reservoir dynamics or physics over its history. The model is examined and validated for both synthetic and real field production data. We demonstrate that the developed approach is capable of providing accurate and reliable production forecasts on a daily basis, even when exact geological information is not available. Introduction Computer technologies are progressing rapidly. The computational capacities currently available provide an opportunity for many subsurface applications to perform complex numerical simulations of high-resolution 3D geocellular computer models. Predictions obtained from such models are an important factor governing efficient reservoir management and decision making. The models describe complex geological features through a set of gridblocks and associated rock and fluid properties. However, in many cases, the geological information is of questionable reliability or not available at all. Even where a high-fidelity model can be built on a reliable basis of reservoir geology, a high-resolution computer model can exceed a few million blocks and can take hours or even days to simulate. It is still not computationally feasible to perform history matching or reservoir-development optimization at such resolution because it involves a large number of simulation runs.
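A much simpler cousin of the forecast-from-production-data idea is fitting an exponential (Arps) decline to observed rates and extrapolating, which can illustrate the principle without any geological model. This sketch is not the DARTS approach, just a minimal data-driven forecast; the rate data are synthetic.

```python
# Minimal data-driven forecast: fit an exponential (Arps) decline
#   q(t) = q0 * exp(-D * t)
# to observed rates via log-linear least squares, then extrapolate.
# The "observed" rates below are synthetic, for illustration only.
import math

months = list(range(12))
rates = [1000.0 * math.exp(-0.05 * m) for m in months]  # synthetic history

# Least squares on ln(q) = ln(q0) - D * t
n = len(months)
mx = sum(months) / n
my = sum(math.log(q) for q in rates) / n
slope = sum((t - mx) * (math.log(q) - my) for t, q in zip(months, rates)) \
        / sum((t - mx) ** 2 for t in months)
q0 = math.exp(my - slope * mx)
decline = -slope

def forecast(month):
    """Forecast rate (STB/d) at a future month from the fitted decline."""
    return q0 * math.exp(-decline * month)

print(f"fitted q0 = {q0:.0f} STB/d, D = {decline:.3f}/month")
print(f"month-24 forecast: {forecast(24):.0f} STB/d")
```

Where a decline curve assumes one fixed trend, the physics-based data-driven approach retrains a simulation proxy as new data arrive, so it can track genuine changes in reservoir dynamics.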
Different methods have been developed to overcome this issue.