The acquisition of gas-in-mud data while drilling, for geological surveillance and safety, is an almost universal practice. This source of data is only rarely used for formation evaluation, owing to the widely accepted presumption that it is unreliable and unrepresentative. Recent developments in the mud logging industry to improve gas data acquisition and analysis have led to the availability of better-quality data. Within a joint ELF/ENI-Agip Division research programme, a new interpretation method has been developed following the comprehensive analysis and interpretation of gas data from a wide range of wells covering different geological, petroleum and drilling environments.
The results, validated by correlation and comparison with other data such as logs, well tests and PVT analyses, enable us to characterise:
porosity variations and permeability barriers,
seal depth, thickness and efficiency,
gas diffusion or leakage,
gas/oil and hydrocarbon/water contacts,
vertical changes in fluid over a thick mono-layer pay zone,
vertical fluid differentiation in multi-layer intervals.
The comparison between surface gas data, PVT and geochemistry data clearly confirms the consistency between gas shows and the corresponding reservoir fluid compositions.
The near real time availability, at no extra acquisition cost, of such data has led to:
the optimisation of future well operations (logging, testing, etc.),
a better integration of while drilling data to the well evaluation process,
a significant improvement both in early formation evaluation and reservoir studies especially for the following applications where traditional log analysis often remains inconclusive:
very low porosity reservoirs,
dynamic barriers and seal efficiency,
low resistivity pay.
Examples show both wellsite quicklooks with simple lithological and fluid interpretations and more complex reservoir and fluid characterisation applications, in varied geographical and geological contexts, that demonstrate how GWD data are integrated with more standard data sets.
The measurement of drilling gas data (gas shows) is standard practice during the drilling of Exploration and Development wells.
Continuous gas monitoring sometimes enables us to indicate, in general terms, the presence of hydrocarbon-bearing intervals, but rarely to define the fluid types (oil, condensate and/or gas, water).
Gas data are at present largely under-utilised because they are considered unreliable and not fully representative of the formation fluids.
There are many reasons for this: on the one hand, poorly established correlations between reservoir fluids and shows at surface; on the other, the influence on the recorded data of numerous parameters such as formation pressure, mud weight and type, gas trap position in the shaker ditch, mud-out temperatures, etc. Another reason may be the very low cost of such data, often equated with low value.
Until a few years ago, the analysis performed on gas shows was generally restricted to the use of Pixler and/or Geoservices diagrams (or equivalent) and of the Wetness, Balance, Character and Gas Normalisation ratios (Pixler, 1968; Haworth et al., 1985; Whittaker & Selens, 1987; Wright, 1996).
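For orientation, the classical ratios named above are simple functions of the chromatograph component readings. The sketch below is a minimal illustration, assuming C1-C5 readings in any consistent unit (e.g. ppm); the exact definitions and the cut-off values used to classify the fluid should be checked against the cited papers.

```python
def gas_ratios(c1, c2, c3, c4, c5):
    """Wetness (Wh), Balance (Bh) and Character (Ch) ratios computed from
    chromatograph readings C1..C5 (any consistent unit, e.g. ppm)."""
    total = c1 + c2 + c3 + c4 + c5
    wetness = 100.0 * (c2 + c3 + c4 + c5) / total   # Wh, in percent
    balance = (c1 + c2) / (c3 + c4 + c5)            # Bh, compared against Wh
    character = (c4 + c5) / c3                      # Ch, refines the oil/gas call
    return wetness, balance, character

# Hypothetical wet-gas show (values in ppm)
wh, bh, ch = gas_ratios(8000, 1000, 400, 150, 50)
```

In a quicklook, Wh and Bh are cross-plotted or compared against each other down the well; Ch is consulted when the first two disagree.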
This paper addresses the problem of real-time reservoir management and simulation while drilling. A software tool developed for this purpose is also described and discussed, and examples of its use are presented.
Over the last 10-15 years the petroleum industry has, like all industries, constantly faced, adopted and implemented new technologies. It is fair to say that the major breakthrough has been in drilling; examples include the drilling of advanced wells, intelligent wells with remote zone control, and drilling-speed improvements due to drilling systems such as Autotrak and drill bits with extended lifetimes. These factors have made optimal well location and completion design a challenging problem.
At the same time, R&D in other disciplines has led to improvements in seismic methods and 3D reservoir modeling methods, to mention a few. Technology push in other areas has also presented the industry with advanced visualization concepts, the possibility of using high-speed data networks between the main office and onshore and offshore sites, and of course the use of the internet as an information database.
The challenge is now to take advantage of the new technology and to optimize existing work processes. The cost and risk associated with drilling a new well is so high that all possible information and tools should be used to optimize well location.
But true real-time well planning is a fight against time, as drilling speed is itself under constant improvement. Another challenge is how to interpret the new data, sort out the important new information and use it in an optimal way. In this respect, efficient and reliable computer tools are needed.
Time is a key factor for decisions while drilling, and the time frame available to the decision makers may differ considerably with the situation. If a pilot hole is drilled, one has on the order of 1-3 days to use the new data and optimize the final well location and trajectory. If a multilateral is drilled, one has maybe 1-2 weeks to optimize the second branch based on data from the first. The most challenging problem is to modify (geosteer) the well trajectory as the well is drilled. Within the area of reservoir simulation, this challenge is often referred to as simulation while drilling (SWD). We define simulation while drilling as any computer-model-based software that may help solve this optimization problem, together with the work process of actually performing these calculations (i.e. data interpretation, model updating and simulation).
The work presented in this paper started in 1997 as a development project between BP (now BP/Amoco), Schlumberger GeoQuest, Norsk Hydro and Saudi Aramco. The idea was to develop a software tool (near-wellbore modeling tool) for optimal well location, completion, and modeling of near-wellbore phenomena. The topic of this paper is simulation while drilling and, to our knowledge, few papers in the petroleum literature exist about this concept. In an ongoing research programme5 at Stanford University (Supri-B program), both analytical and numerical approaches will be considered. A semi-analytical solution that accounts approximately for reservoir heterogeneity will be considered as a basis for SWD in the first part of the project. In the second part of the project, the idea is to use SWD with a numerical simulator. Experience with using PC-based reservoir modeling tools on-site is described and studied by Buchwalter et al.6
The need for dynamic evaluation is case dependent. In simple (relatively speaking) cases (for instance: two-phase flow, close-to-homogeneous reservoirs, layered models with no dip), analytical methods or simply field experience from drilling similar wells may be sufficient. But in the complicated three-phase case with heterogeneities, influence of neighboring wells (infill drilling) and gas and/or water influx, simulation is considered to be the best alternative.
Brage field offshore Norway is producing off plateau at 8500 Sm3/d oil. Associated daily produced water rates are in excess of 16000 m3.
Following a change in injection scheme from seawater to aquifer water in 1997, two water supply wells were drilled and completed in the Utsira reservoir. Artificial lift from Utsira was achieved by installation of Electric Submersible Pumps. In parallel, the topside system was upgraded to facilitate complete PWRI (Produced Water Reinjection) in the Statfjord sandstone reservoir.
The Utsira aquifer, located 600 m below seabed, has shown excellent properties with respect to productivity and deliverability. ESP equipment installed in A-24 and A-29 has had a continuous improvement in reliability since 1998.
Topside, PWRI has resulted in wear on the injection pumps as a consequence of high particle content. Corrosion has lately been detected in the Utsira water producers, the injection flowlines and the tubings of three injection wells.
Sulphate scaling resulting from early seawater injection has been moderate at Brage. No problems associated with H2S have been seen following the change in water management philosophy. Carbonate scale in the upper tubing is successfully removed by a new acid deflection tool run on wireline while bullheading HCl. Bullhead squeeze treatments are utilized to control further scale buildup.
The closed loop approach to water handling at Brage has proven environmentally favourable, and has prevented the discharge of 9 500 000 m3 produced water to sea.
The Brage Field is located in the Norwegian sector of the North Sea and is operated by Norsk Hydro. Production commenced in September 1993 from five predrilled wells tied back to the steel jacket platform. The two main reservoirs, Fensfjord and Statfjord, are located at approximately 2100 and 2300 m TVD and consist of Jurassic-age sandstones. One well has been drilled and completed in the Sognefjord reservoir, located in the north-eastern part of the field.
The platform is equipped with a two stage separation system incorporating gas compression/export facilities to Statpipe and oil export via Oseberg Field Centre. Water injection is the main drive mechanism at Brage. A WAG-scheme was implemented for several of the Fensfjord flank injectors shortly after field startup.
Jonkman, R.M. (International Oil & Gas Services) | Bos, C.F.M. (Netherlands Institute of Applied Geoscience TNO) | Breunese, J.N. (Netherlands Institute of Applied Geoscience TNO) | Morgan, D.T.K. (Uncertainty Management Ltd.) | Spencer, J.A. (Reserves Management Ltd.) | Søndenå, E. (Norwegian Petroleum Directorate)
On behalf of a group of sponsors consisting of the Norwegian Petroleum Directorate (NPD) and most E&P companies active in Norway, a workgroup was set up to author a report on the Best Practices and Methods in Hydrocarbon Resource Estimation, Production and Emissions Forecasting, Uncertainty Evaluation and Decision Making. The workgroup is part of Norway's forum for Forecasting and UNcertainty (FUN).
Following a detailed data acquisition and interviewing phase to make an inventory of the current practice of all sponsors involved, the workgroup postulated a relationship between a company's practices and its economic performance. A key distinguishing factor between companies is the degree to which probabilistic methods are adopted in integrated multi-disciplinary processes, aimed at supporting the decision making process throughout the asset life cycle and portfolio of assets.
Companies have been ranked in terms of this degree of integration and best practices are recommended. In many companies a gap seems to exist between available and applied technology. Data and (aggregated) information exchange between Governments and companies is also discussed. A best practice based on their respective decision making processes is recommended.
Norway's forum for Forecasting and UNcertainty evaluation (FUN, ref. 1) was established in 1997, and has 18 member companies plus the Norwegian Petroleum Directorate (NPD). The forum is a Norwegian Continental Shelf arena to determine best practice and methods for hydrocarbon resource and emissions estimation, forecasting uncertainty evaluation and decision making. It focuses on matters related to forecasting and uncertainty evaluation of future oil and gas production. Its main purpose is to optimize the interplay between the private industry and the national authorities wishing to regulate their national assets.
The basic question that kicked off the FUN Best Practices project was whether the accuracy of Norway's historical production forecasts has been disappointing because of erroneous contributions from the companies or because of wrong aggregation by NPD. The question was posed which "Best Practices" could improve this situation. Whereas reserves form the basis for production, capex, opex and emissions forecasting, the decision making process in the various companies and national authorities links the various components together. Using the latest WPC/SPE guidelines for reserves reporting (allowing the use of probabilistic methods), the project concentrated on assessing the potential advantages of probabilistic techniques when used in combination with fully integrated asset management workflow processes.
After a discussion of the current practices of the various companies and authorities visited, "Best Practices" are formulated in the fields of estimating reserves, production, costs and emissions forecasting, decision-making, planning and communications. The paper concludes with recommendations on how to move from the "current practices" to the desired "Best Practices".
Recent unfavourable fluctuations in the oil price1 have pushed most oil companies toward cost-effective policies. Technology development is one of the key factors in a cost-reduction strategy. As a consequence, the strategic scenario expected in the petroleum industry for the near future is an increased search for innovative technology solutions.
In this scenario, R&D activities, besides greatly affecting the strategic technology portfolio, could influence the business more than ever.
However, future R&D projects could become increasingly onerous and complex owing to greater technical and financial risks.
The introduction of new assessment techniques, in particular of economic and financial evaluation, might become decisive. This paper presents a new tool supporting the selection of R&D project proposals according to strategic needs, expected economic benefits and expected profitability. It is a computerised system that supports the decision-maker in constructing the necessary estimates following an incremental procedure. R&D projects are compared and ranked on the basis of a multiattribute evaluation. Economic and financial performance is evaluated on the basis of NPV, IRR and payback; probabilistic analysis and risk assessment are also carried out. Examples of the ranking of R&D project proposals from the internal R&D department are also presented.
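To make the screening metrics concrete, the sketch below computes NPV, IRR (by bisection) and simple payback for a yearly cash-flow vector. This is a generic illustration, not the paper's tool; the discounting convention (end-of-year flows, first element undiscounted at t=0) is an assumption.

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at t=0 (undiscounted)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-8):
    """Internal rate of return by bisection; assumes a single sign change
    of NPV over [lo, hi], as for a conventional invest-then-earn profile."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if npv(mid, cashflows) > 0.0:
            lo = mid       # NPV still positive: the rate can go higher
        else:
            hi = mid
    return 0.5 * (lo + hi)

def payback(cashflows):
    """First year in which the cumulative cash flow turns non-negative."""
    cum = 0.0
    for t, cf in enumerate(cashflows):
        cum += cf
        if cum >= 0.0:
            return t
    return None  # never pays back within the horizon

# Hypothetical R&D project: 100 invested now, 40 per year back for 4 years
project = [-100.0, 40.0, 40.0, 40.0, 40.0]
```

Ranking then amounts to sorting candidate projects on a weighted combination of these indicators, with Monte Carlo sampling of the cash flows supplying the probabilistic spread.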
About 30 years ago a group of researchers working for MIT and headed by Meadows published2 the results of a study on the duration of oil and gas supply over the following decades. They reported that oil reserves could last only about 31 years and gas reserves 38 years. That forecast implied that by the end of 2002 no oil would be left, and that by the end of 2010 all gas reserves would be exhausted.
That evaluation caused disconcert and panic among investors and management. Oil companies, but also industries worldwide, were forced to face a scenario of decreasing energy supply without valid and effective alternatives. Fortunately, the statements of Meadows' report were misleading because of the constraints used, not well understood by most economists, such as holding the discovery rate and the oil and gas consumption constant. Today, those statements can be considered only the results of a good academic simulation of reality under particular boundary conditions, and not at all a reflection of the times that followed.
In fact, new giant fields discovered over the last three decades, and many other fields exploited at low cost using technologies not yet available at the time of Meadows' study, have greatly increased the reserves. Today, fields at depths certainly not considered at that time, such as 7-8 kilometres and more, can easily be put on production. Wells 13-14 kilometres long, most of it horizontal section, can be drilled and put on production. New fields can be discovered under seabeds beneath 2-3 kilometres of water.
New technology solutions opened windows for exploiting areas then considered uneconomic, increased the productivity and recovery factor of existing areas, and so on. The ability to develop innovative technologies in order to exploit economically areas that were previously unprofitable has thus proved fundamental.
Perhaps the error made by the economists at that time was to treat the evolution of this activity and of the oil business as static, without further change and destined to disappear. But reality in its entirety is always dynamic, and the dreadful problem of increasing the discovery rate and the reserves has been abundantly overcome by means of new technologies, mainly the results of R&D efforts over the last decades.
Varvelli et al.3 state that we still have oil and gas reserves for about one hundred years and place the end of the oil and gas supply at approximately 2100. They also indicate that we will hardly reach that end, because other energy supplies, most of them already known (such as wind, solar, biomass, hydroelectric, geothermal, etc.) and others not yet known, will substitute the energy from oil and gas before its exhaustion, as happened with coal.
The laboratory and field studies were focused on the development of a well treatment technology that can be used to restrict gas coning in the Algyö reservoir. The method is based on the simultaneous placement of a polymer/silicate gel at the GOC and a supporting foam pillow in the oil-bearing layer. Extensive laboratory studies were carried out to tailor the chemical system to the field conditions. The improvements focused on the use of short-chain alcohol in the gel-forming system and of natural protein or glycerine as an additive in the foam-forming solutions. Special attention was paid to the implementation of the technology. A two-point injection regime was elaborated to develop a horizontal barrier between the oil- and gas-bearing zones. Three projects consisting of five treatments, one of them a repeated one, were carried out between 1991 and 1999. The field tests usually resulted in partial success: the hydrodynamic measurements definitely showed that the barrier was correctly placed, the oil flow in the pay zone was not affected, the GOR temporarily improved and the WOR changed positively. The long-term observations, however, definitively prove that application of a foam pillow instead of other supporting media significantly increases the chance of a measurable change in the unwanted gas production. On the other hand, the investment and the rate of return on expenditure are not balanced, even in those cases where the treatment is qualified as technically successful. The Hungarian experts take the position of extending the observation period for more cycles and collecting sufficient evidence to either continue or suspend the R&D activities and the field projects.
Restriction of gas coning and dipping in vertical and horizontal wells has been a focus of production engineering during the past decades. Earlier, the laboratory and field studies and, later, the mathematical simulations were primarily aimed at determining the right position and length of perforation and/or the maximum critical oil rate that provide the most profitable and permissible gas-free oil production1. That fundamental concept changed only a little over many years, although valuable results of physical modeling and mathematical simulation came to light in the literature2,3. A real breakthrough, however, was not achieved, despite the fact that the understanding and prediction of displacement processes in gas-capped systems became much more precise4,5. Despite useful contributions to the topic, the field management of complex reservoirs having unfavorable gas/oil mobility and thin oil-bearing layers still represents a great challenge even today6, and operators still have high hopes for the application of horizontal wells7-9 in gas-capped reservoirs. In fact, the severe water cresting and gas dipping problems in operating horizontal wells significantly accelerated the lab and field studies and urged the search for a practical method to ease this production problem7-10. However, it is also clear that improvement of the oil rate (e.g. by placing the wellbore or perforation at about 70% of the reservoir thickness8) is often limited, particularly when the pay zone is thin and an aquifer is present. Nevertheless, optimum placement of the horizontal wellbore and precise determination of the critical oil rate remain the mainstays of technologies even today to prevent unfavorable gas production (coning and dipping) in oil wells.
An investigation of alternative EOR processes having potential application in the giant Ekofisk chalk field is presented. Technical feasibility, process readiness, oil recovery potential, and related uncertainties and risks of five selected EOR processes, namely hydrocarbon (HC) WAG, nitrogen (N2) WAG, carbon dioxide (CO2) WAG, air injection and microbial EOR (MEOR), are assessed for possible application at Ekofisk. The objective of the screening study was to evaluate and rank the EOR alternatives and to select the most attractive process(es) on which to pursue further work toward possible field pilot testing. The focus of the paper is on the technical assessment of the relative oil recovery potential of each process, and on the importance of identifying critical operational and logistical considerations for implementation of an EOR process in the offshore North Sea operating environment.
Estimates of potential EOR incremental oil recovery for the Ekofisk field can be quite significant. However, key project development and implementation issues and additional cost elements must be weighed equally with oil recovery forecasts in any EOR process ranking. Some of these issues (e.g. injection gas supply, facilities requirements, and the impact of EOR on chalk compaction, subsidence and wellbore integrity) may be significant enough to eliminate a process from further consideration.
In addition, there are significant differences in the quantity and quality of key laboratory and field data supporting the viability of the various EOR processes being considered. Only a limited amount of field-specific data are available to calibrate the performance predictions for some of the processes. There is also a wide variation in the technical readiness of each process to begin field pilot design studies. Table 1 summarizes the state of technical readiness for field implementation of each process and identifies some of the major risk elements and remaining work required to progress these EOR processes at Ekofisk.
The Ekofisk Field is located in the Norwegian Sector of the North Sea, Figure 1. The reservoir is an elongated anticline with the major axis running North-South covering roughly 12,000 acres, Figure 2. It produces from two fractured chalk horizons, the Ekofisk and Tor Formations, separated by a tight zone. The overlying Ekofisk Formation has a depth of about 9,600 feet and thickness varies from 350 - 500 feet with porosities from less than 30% to 48%. The underlying Tor Formation thickness varies from 250 - 500 feet with porosities from less than 30% to 40%. About two thirds of the 6.4 billion STB OOIP is in the Ekofisk Formation. The initial reservoir pressure was 7135 psia at a depth of 10,400 feet. The field initially contained an undersaturated volatile oil with a bubble point pressure of 5560 psia at the temperature of 268°F.
Ekofisk1 was discovered in 1969 and test production was started in 1971 from the discovery well and three appraisal wells. Commercial test rates prompted development of the field from three platforms. Permanent facilities with 54 well slots and 300,000 STB/D (design capacity) process facilities were operational in May 1974. Development drilling was started June 1974. Oil production peaked at 350,000 STB/D in October 1976. Produced gas was reinjected2 until a gas pipeline was installed to Emden, Germany, September 1977.
The paper presents a new methodology for building integrated risk models for reservoir and operational costs. Although the resulting models are fairly simple, they have proved to be adequate in several real-life applications in large field development projects as a part of a total value chain risk analysis. By using these models it becomes possible to optimize intervention strategies with respect to the total net present value of the project or other success criteria such as break-even price.
When analyzing operational costs related to an oil or gas field, one usually develops a static cost profile based on the assumed expected yearly production throughout the lifetime of the field. However, when reservoir uncertainty is introduced, this approach breaks down, as different production scenarios must be matched with corresponding operational cost profiles. A typical risk analysis involves carrying out some sort of Monte Carlo simulation. To be able to do this, it is necessary to establish a more dynamic link between reservoir and operational costs. While some of these costs are more or less constant independent of the production, others depend heavily on how much the reservoir produces. In this paper we focus on the last category. It is natural to divide such costs into two subcategories: rare events with high costs, e.g., well interventions such as side-track operations, and frequent events with low costs, typically scale squeeze operations. Two different models will be presented dealing with these issues. For the rare events with high costs case, we present a lifetime model where the time is measured in cumulative production. For the frequent events with low costs, the relation between production and operational costs is developed based on curve fitting.
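The two subcategories above can be sketched in a single Monte Carlo routine. This is an illustrative model, not the paper's: rare events are drawn from an exponential lifetime measured in cumulative production, and the frequent, low-cost events are represented by a fitted linear cost-versus-rate curve. The coefficients a and b, the mean lifetime and the event cost are all hypothetical parameters.

```python
import random

def opex_realization(production, mean_life_cum, event_cost, a, b, rng):
    """One Monte Carlo realization of production-dependent operational costs.

    production    : yearly produced volumes
    mean_life_cum : mean cumulative production between rare failures
                    (exponential lifetime measured in produced volume)
    event_cost    : cost of one rare, high-cost intervention
    a, b          : fitted curve for frequent, low-cost events (cost = a + b*q)
    """
    costs = []
    next_fail = rng.expovariate(1.0 / mean_life_cum)
    cum = 0.0
    for q in production:
        cost = a + b * q                  # frequent events: fitted curve
        cum += q
        while cum >= next_fail:           # rare events triggered by throughput
            cost += event_cost
            next_fail += rng.expovariate(1.0 / mean_life_cum)
        costs.append(cost)
    return costs

# One realization for a declining production profile (hypothetical numbers)
rng = random.Random(42)
profile = [3.0e6, 2.5e6, 2.0e6, 1.5e6, 1.0e6]   # yearly volumes
yearly_costs = opex_realization(profile, 4.0e6, 25.0e6, 1.0e6, 0.5, rng)
```

Repeating this over many sampled production scenarios, and discounting each cost profile alongside its matching revenue profile, yields the distribution of net present value that the intervention strategy is optimized against.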
In recent years, several oil field development projects in the North Sea have experienced major cost overruns of a magnitude that threatened their profitability. The planned versus the actual costs, the time it takes to reach ‘first oil', the return on investment, etc. are now subject to much more thorough investigation (see Kaasen1) and public debate than earlier. Recently the NORSOK initiative (Matland et al.2) put the focus on economics in offshore projects. Interest in this issue became even stronger as the oil price dropped in 1998 and the beginning of 1999. If the cost overruns combined with low oil prices prevailed, several large oil fields would no longer be profitable, and few new fields would be developed.
Although the issue is rather complex, there is one question that deserves to be the focus of the present debate: do the traditional economic analyses, carried out before a field is approved for development, properly reflect the risks involved? By calculating the risk premium as an addition to the ‘risk-free' cost of capital, and by using the resulting interest rate to discount the cash flows, one conceals the spread, i.e. the size of the up- and downside risk inherent in the investment. In order to describe all aspects of the risks involved, it is necessary to model the different uncertainties and dependencies explicitly.
In recent years, it has been shown that inverse problem theory based on Bayesian estimation provides a powerful methodology not only to generate rock property fields conditioned to both static and dynamic data, but also to assess the uncertainty in performance predictions. To date, standard applications of inverse problem theory given in the literature assume that rock property fields obey a multinormal distribution and are second-order stationary. In this work, we extend the inverse problem theory to cases where rock property fields (only porosity and permeability fields are considered) can be characterized by fractal distributions. Fractional Gaussian noise (fGn) and fractional Brownian motion (fBm) are considered. To the authors' knowledge, there exists no study in the literature considering the generation of fractal rock property fields conditioned to dynamic data, particularly to well-test pressure data.
We show that available Bayesian estimation techniques based on the assumption of normal, second-order stationary distributions can be directly used to generate fGn rock property fields conditioned to both hard and pressure data, because fGn distributions are normal and second-order stationary. On the other hand, we show that, because fBm is not second-order stationary, these Bayesian estimation techniques can only be used with the implementation of a pseudo-covariance (generalized covariance) approach to generate fBm fields conditioned to static and well-test pressure data.
Using synthetic examples generated from two- and three-dimensional single-phase flow simulators, we show that the inverse problem methodology can be applied to generate realizations of fBm/fGn porosity and permeability fields conditioned to well-test pressure data. We conclude by showing how one can then appropriately assess the uncertainty in reservoir performance predictions using these realizations.
Recent studies1,2 have presented procedures based on inverse problem theory of Tarantola3 to generate realizations of rock property fields conditioned to a prior geostatistical model (means and variograms), hard data, and multiwell pressure data for one, two and three dimensional single-phase flow problems. These studies show that the inverse problem theory based on Bayesian estimation coupled with their procedures provides a powerful methodology not only to generate rock property fields conditioned to different data types, but also to assess the uncertainty in reservoir performance predictions. To date, standard applications of inverse problem theory given in the literature1-2,4-6 assume that rock property fields obey multinormal (Gaussian) distribution and are second order stationary.
Recent studies7-8 have shown that fractals like fractional Gaussian noise (fGn) and fractional Brownian motion (fBm) are promising approaches to characterize porosity and/or permeability, in general the hydraulic property variations in the subsurface. Samples of geologic property variations observed in some horizontal and vertical well logs show the persistence of fGn with Hurst coefficient, H>0.7 (see Refs. 7 and 8).
In the literature, there exist stochastic interpolation methods7,9 that can be used to generate conditional fractal simulations honoring variograms and hard data (measurements of porosity and permeability at wells). However, there exists no study in the literature that considers generating fractal fields conditioned to dynamic data, in particular to well-test pressure data. Thus, the objective of this work is to generate fractal (fGn and fBm) porosity and permeability fields conditioned to variograms, hard data and well-test pressure data by using the inverse problem theory.
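As background for the fGn case, an unconditional fGn realization honoring a given Hurst coefficient can be generated from the exact fGn autocovariance via a Cholesky factorization; conditioning to hard and pressure data, the subject of this paper, is then layered on top of such a prior model. The sketch below shows only this unconditional step, with hypothetical parameters; summing the fGn increments gives an fBm trace.

```python
import numpy as np

def fgn_cov(n, H, sigma=1.0):
    """Exact autocovariance matrix of fractional Gaussian noise:
    gamma(k) = (sigma^2 / 2) * (|k+1|^2H - 2|k|^2H + |k-1|^2H)."""
    k = np.arange(n, dtype=float)
    gamma = 0.5 * sigma**2 * (np.abs(k + 1)**(2 * H)
                              - 2.0 * np.abs(k)**(2 * H)
                              + np.abs(k - 1)**(2 * H))
    lag = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    return gamma[lag]   # Toeplitz matrix built from the lag sequence

def simulate_fgn(n, H, sigma=1.0, seed=0):
    """Unconditional fGn realization via Cholesky of the covariance."""
    C = fgn_cov(n, H, sigma)
    L = np.linalg.cholesky(C + 1e-10 * np.eye(n))  # jitter for stability
    z = np.random.default_rng(seed).standard_normal(n)
    return L @ z

# fGn is stationary; its running sum is a (non-stationary) fBm trace
increments = simulate_fgn(256, H=0.8)
fbm = np.cumsum(increments)
```

With H > 0.5 the increments are persistent, which is the behavior reported for the well-log property variations cited above; the Cholesky route is exact but O(n^3), so spectral methods are preferred for large grids.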
P/Z plots of tight gas reservoirs based on non-extrapolated pressures are generally curved and can deviate strongly from the theoretical straight line. Interpretation of these plots by the conventional straight-line method may significantly underestimate GIIP. The paper presents and explains the p/z behavior of tight gas reservoirs for a number of simple prototype reservoirs. It is shown that in many cases the curved p/z plots can still be interpreted quantitatively. The key to the interpretation is the concept of semi-steady-state time. Even if no quantitative interpretation is possible, a p/z plot is still a powerful diagnostic tool. When p/z plots cannot be analyzed quantitatively, one has to resort to history matching of shut-in bottomhole pressures and of continuously measured wellhead pressures by using a numerical reservoir simulator.
Material balance (p/z) analysis is a popular and powerful tool for the evaluation of the production performance of gas reservoirs. Application to tight gas reservoirs is not straightforward, however, and its results can be grossly misleading. Some even rule out the application of material balance analysis to tight gas reservoirs altogether because depletion of tight gas reservoirs would violate the basic assumptions of the material balance method (Ref. 1). The main problem is not so much that the material balance assumptions no longer hold (mass conservation also works in tight gas reservoirs) but that it becomes difficult to obtain representative reservoir pressures from observed shut-in pressures. This is because for tight gas reservoirs the shut-in periods are generally too short for a stable well pressure to develop. In addition, production from a tight gas reservoir may take place under transient conditions for a long time. As a result only a small part of the reservoir will be influenced by production.
The purpose of this paper is (1) to present and explain the main characteristics of the p/z behavior of tight gas reservoirs, and (2) to provide guidelines for the interpretation of p/z plots of tight gas reservoirs. To this end we have numerically simulated p/z behavior of the following prototype reservoirs: (1) a homogeneous reservoir depleted by a single vertical production well in the center, (2) a homogeneous reservoir depleted by a single vertical production well off-center, (3) a reservoir depleted by two production wells of different flow capacity, (4) a faulted reservoir depleted by a single well, and (5) a commingled stratified reservoir depleted by a single well.
We have generated the p/z data points by shutting in the production wells periodically and by taking the pressures at the end of the shut-in periods as the observed pressures for the p/z plots. As for the depletion process we have restricted ourselves to volumetric pressure depletion. The true p/z curve for this kind of depletion is a straight line, no matter how complex the reservoir is. In the analyses of the observed p/z plots we have focussed primarily on the estimation of gas-initially-in-place.
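For volumetric depletion, material balance gives p/z = (p_i/z_i)(1 - Gp/G), so the straight line through the observed (Gp, p/z) points extrapolates to the GIIP at p/z = 0. The sketch below is a minimal least-squares version of this conventional interpretation; as stressed above, it is only valid when the shut-in pressures approximate the true average reservoir pressure.

```python
def giip_from_pz(gp, p_over_z):
    """Fit a straight line to (Gp, p/z); its x-intercept estimates GIIP.
    gp       : cumulative production at each observation
    p_over_z : observed p/z at the same points"""
    n = len(gp)
    mx = sum(gp) / n
    my = sum(p_over_z) / n
    sxx = sum((x - mx) ** 2 for x in gp)
    sxy = sum((x - mx) * (y - my) for x, y in zip(gp, p_over_z))
    slope = sxy / sxx                  # negative for a depleting reservoir
    intercept = my - slope * mx        # equals p_i/z_i for a perfect line
    return -intercept / slope          # Gp at which p/z = 0

# Ideal (noise-free) example: p_i/z_i = 30 000 kPa, G = 1.0e9 m3
gp = [0.0, 2.0e8, 4.0e8]
pz = [30000.0 * (1.0 - x / 1.0e9) for x in gp]
```

In the tight gas cases studied here, fitting this line to early, under-extrapolated shut-in pressures is exactly what produces the GIIP underestimate that the curved-plot interpretation corrects.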
The properties of the prototype reservoirs have been selected to cover ranges that are typical of North Sea tight gas reservoirs. The initial pressure is 300 x 10^2 kPa. The porosity is 10 percent. The permeabilities range from 0.01 to 10 mD. The thickness of the reservoir is 50 m. The chosen gas properties represent a typical dry gas with a gas gravity of 0.65. For the production constraints we have assumed a maximum well rate of 1 x 10^6 m3/d and a minimum wellhead pressure of 20 x 10^2 kPa throughout the production history. In all cases we have simulated a total production time of 15 years.
We have performed the simulations with the commercial IMEX black-oil reservoir simulator developed by the Computer Modelling Group. We have made sure and verified that IMEX accurately simulates the pressure distribution in the prototype gas reservoirs during depletion and during shut-in periods. Appendix A details the simulation input data for each of the prototype reservoirs.