Determining the potential of shale-gas reservoirs involves an exhaustive process of calculating the total gas volume, or original gas in place (OGIP). The calculation of total gas relies on calibrating wireline logs to core data, which are considered an empirical validation or ‘ground truth’. However, inconsistency in sample preparation and analytical techniques within and between laboratories creates significant uncertainty in calculating the free- and adsorbed-gas components that constitute total gas. Here, we present an analytical program performed on core samples to elucidate the causes of uncertainty in the calculation of total gas. The findings of this program are used to propose improved methods of calculating total gas from core.
Free gas calculated from properties such as porosity and water saturation measured on core was found to be highly dependent on laboratory analytical protocols. Differences in sample preparation and water-extraction methods led to relative differences between laboratories of 20% in water saturation and 10% in porosity, which in turn produced differences of 35% in calculated free gas in place (FGIP).
Adsorbed gas was evaluated using methane-adsorption testing to study changes in Langmuir parameters, over a range of temperatures, in samples with a wide variety of water saturations, clay contents, and total organic contents. The adsorbed-gas storage capacity was found to increase artificially by a factor of two to three when the experimental temperature exceeded the boiling point of water. This increase is related to the expulsion of clay-bound water and the subsequent availability of clay surfaces for methane adsorption.
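The two-to-three-fold artifact can be expressed through the Langmuir isotherm, where storage capacity at a given pressure scales with the Langmuir volume. The parameter values below are hypothetical placeholders, not measurements from the study; only the factor-of-three effect comes from the text.

```python
def langmuir_adsorbed_gas(p, v_l, p_l):
    """Adsorbed-gas storage capacity from the Langmuir isotherm:
    G = V_L * p / (p_L + p), in the units of V_L (e.g. scf/ton)."""
    return v_l * p / (p_l + p)

# Hypothetical parameters: a water-saturated sample vs. the same sample
# after clay-bound water is expelled above the boiling point of water.
p = 2000.0        # psia, test pressure (assumed)
p_l = 600.0       # psia, Langmuir pressure (assumed)
v_l_wet = 50.0    # Langmuir volume with clay surfaces blocked by water
v_l_dry = 150.0   # 3x higher once clay surfaces become available

g_wet = langmuir_adsorbed_gas(p, v_l_wet, p_l)
g_dry = langmuir_adsorbed_gas(p, v_l_dry, p_l)
print(round(g_dry / g_wet, 6))  # capacity ratio equals the Langmuir-volume ratio
```

At a fixed pressure the capacity ratio reduces to the ratio of Langmuir volumes, which is why dewatering the clays inflates the apparent storage capacity directly.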
Total gas in place (TGIP) is the sum of the free- and adsorbed-gas volume estimates. The interaction and overlap of pore space between these two volume components are also important to consider. We propose a simple monolayer-based correction that removes the volume occupied by adsorbed gas from the free-gas pore volume, based on a composite pore-size distribution from scanning electron microscopy (SEM) point counting and nitrogen-adsorption data.
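A double-counting correction of this kind can be sketched as follows: the adsorbed phase physically occupies part of the pore space, so its volume is subtracted from the pore volume available to free gas. The adsorbed-phase density and all inputs below are illustrative assumptions, not values or the exact procedure from the paper.

```python
# Sketch of a monolayer-style double-counting correction between the
# free- and adsorbed-gas volume components.

RHO_ADS = 0.34          # g/cm3, a commonly assumed adsorbed-methane density
RHO_GAS_STD = 0.000678  # g/cm3, approximate methane density at standard conditions

def corrected_free_pore_volume(pore_vol_cm3, adsorbed_gas_scc):
    """Pore volume (cm3) left for free gas after removing the adsorbed phase.

    adsorbed_gas_scc: adsorbed gas expressed at standard conditions (cm3)."""
    adsorbed_mass = adsorbed_gas_scc * RHO_GAS_STD   # grams of adsorbed methane
    adsorbed_phase_vol = adsorbed_mass / RHO_ADS     # cm3 occupied on pore surfaces
    return pore_vol_cm3 - adsorbed_phase_vol

# Hypothetical sample: 10 cm3 of pore volume, 500 scc of adsorbed gas.
pv = corrected_free_pore_volume(10.0, 500.0)
print(round(pv, 3))  # slightly below 10 cm3 once the adsorbed layer is removed
```

In practice the correction would be weighted by the pore-size distribution (SEM and nitrogen-adsorption data), since adsorption is concentrated in the smallest pores; the sketch above shows only the bulk-volume bookkeeping.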
Pressurized sidewall-core samples were acquired at reservoir conditions to measure free- and adsorbed-gas volumes during controlled depressurization under laboratory conditions. This provided a baseline measurement for comparison with calculations from traditional measurements, to identify which laboratory protocol and sample-preparation technique provided the most robust results.
This study has elucidated methods to reduce the uncertainty in gas-in-place calculations and better understand resource distribution in dry-gas source rocks.
The objective of this paper is to explore the benefits of using the Interactive Epoch-Era Analysis (IEEA) methodology for evaluating architectural changes in a trade-space exploration study. A subsea tieback offshore Brazil is used as a reference case to investigate this premise from a full-field-development perspective.
An automated concept-exploration tool is employed. It applies meta-heuristics to generate different offshore-facilities concepts with varying building blocks. The interaction between reservoir behavior and facilities design is accounted for, meaning pressure and temperature losses throughout the system are taken into account differently in each concept. These concepts are ranked in terms of economic performance indicators (NPV, IRR, etc.), and each run with a given set of boundary conditions covers what is called an Epoch. This process is iterated over the whole life of field with a set of different boundary conditions, such as commercial aspects ($/bbl, $/MMBtu, market demand) and/or technological-maturity aspects (TRL, novel technological concepts), generating what is called an Era. The whole data set is then evaluated in an interactive platform through the Humans-In-the-Loop (HIL) process.
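The per-epoch ranking step can be sketched in miniature. The concept names, cash-flow streams, and discount rate below are hypothetical placeholders; in the actual tool these streams come from the coupled reservoir-facilities models under that epoch's boundary conditions.

```python
# Minimal sketch of ranking facility concepts by NPV within one epoch.

def npv(cash_flows, rate):
    """Discount a yearly cash-flow stream (year 0 first) at a fixed rate."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical concepts with illustrative $MM cash flows.
concepts = {
    "tieback_8in":  [-500.0, 200.0, 190.0, 170.0, 140.0],
    "tieback_10in": [-650.0, 240.0, 230.0, 200.0, 170.0],
}

epoch_rate = 0.10  # discount rate set by this epoch's commercial assumptions
ranking = sorted(concepts, key=lambda c: npv(concepts[c], epoch_rate), reverse=True)
print(ranking)
```

Re-running the same ranking under a different set of boundary conditions (prices, demand, TRL-gated concepts) produces the next epoch, and the sequence of epochs over the life of field forms an era.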
Model-Based Systems Engineering (MBSE) is being employed successfully in engineering fields outside the O&G context, such as the aerospace and automotive industries. While digital tools have been identified as a potential key contributor to future O&G performance enhancement and further cost reductions, that potential has yet to be demonstrated. This work intends to provide backing for that argument in one of the potential applications, early concept-exploration phases, by showing that quick, high-value assessments following an MBSE approach may be carried out once significant effort has been put into the proper development, verification, and validation (V&V) of such digital tools.
While integrated models for asset development have long been a subject of interest for O&G operators, the application of Systems Engineering concepts to them has not yet been thoroughly explored. Systems Engineering provides a rigorous and proven method of dealing with complex systems that is highly applicable to offshore field developments. MBSE is the current state of the art for capital-intensive projects such as space-exploration spacecraft and rovers. Learning from these successful use cases and applying their methodologies in the development of digital technologies may provide a new set of tools for O&G operators' facilities engineers and the like. The case study presented shows MBSE's capability to capture the intrinsic non-linearities and specificities of each O&G field/location while ensuring that project-wide functional requirements are successfully met.
Time does not feature in the conventional material-balance equations. However, there are significant advantages if time is incorporated into the analysis, for example: a) identifying whether all the wells belong to the same reservoir; b) identifying the effect of external energy sources such as gas or water drive; c) incorporating the contribution of communicating tight reservoirs; and d) visualizing the results in pressure-time format. The time-based analysis presented in this paper supplements the conventional methods and helps reduce the non-uniqueness of the solution. In contrast to the conventional Havlena-Odeh plotting variables, which are complex and non-intuitive, the pressure-time plot and corresponding pressure-history match are much easier for an engineer to comprehend and to use when evaluating the validity or uniqueness of the results.
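For the gas case, the idea of recasting a material balance in pressure-time form can be sketched very simply: p/z declines linearly with cumulative production, and a rate schedule maps cumulative production onto time. All inputs below are hypothetical, and a constant z-factor and constant rate are assumed purely to keep the illustration short.

```python
# Sketch: a tank material balance expressed against time rather than
# against cumulative production, so it can be overlaid on measured
# pressure history.

G = 100.0       # gas in place, Bscf (assumed)
pz_i = 5000.0   # initial p/z, psia (assumed; z held constant for simplicity)
q = 0.05        # constant production rate, Bscf/day (assumed)

def p_over_z(t_days):
    """p/z at time t from the tank material balance: p/z = (p/z)_i (1 - Gp/G)."""
    gp = q * t_days                  # cumulative production at time t
    return pz_i * (1.0 - gp / G)

print(round(p_over_z(0.0), 2), round(p_over_z(1000.0), 2))
```

The resulting p/z-versus-time curve can be compared directly with the measured pressure history, which is the intuitive check the pressure-time plot provides over the Havlena-Odeh variables.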
Straight-line analysis (SLA) methods, which are a sub-group of model-based techniques used for rate-transient analysis (RTA), have proven to be immensely useful for evaluating unconventional reservoirs. Transient data can be analyzed using SLA methods to extract reservoir/hydraulic fracture information, while boundary-dominated flow data can be interpreted for fluid-in-place estimates. Because transient flow periods may be extensive, it is also advantageous to evaluate the volume of hydrocarbons-in-place contacted over time to assist with reserves assessment. The new SLA method introduced herein enables reservoir/fracture properties and contacted fluid-in-place (CFIP) to be estimated from the same plot, which is an advantage over traditional SLA techniques.
The new SLA method utilizes the
Validation of the new SLA method for an undersaturated oil case is performed through application to synthetic data generated with an analytical model. The new SLA method yields estimates of LFP and OFIP that are in excellent agreement with model input (within 2%). Further, the results are consistent with the traditional SLA methods used to estimate LFP (e.g., the square-root-of-time plot) and OFIP (e.g., the flowing-material-balance plot).
Practical application of the new SLA method is demonstrated using field cases and experimental data. Field cases studied include online oil production from a multi-fractured horizontal well (MFHW) completed in a tight oil reservoir, and flowback water production from a second MFHW, also completed in a tight oil reservoir. Experimental (gas) data, generated using a recently introduced RTA core-analysis technique, were also analyzed using the new SLA method. In all cases, the new SLA method results are in excellent agreement with those of traditional SLA methods.
The new SLA method introduced herein is an easy-to-apply, fully analytical RTA technique that can be used for both reservoir/fracture characterization and hydrocarbons-in-place assessment. This method should provide important, complementary information to traditionally used methods, such as the square-root-of-time and flowing-material-balance plots, which are commonly used by reservoir engineers for evaluating unconventional reservoirs.
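As background for the traditional square-root-of-time plot referenced above: during transient linear flow, rate-normalized pressure drop is linear in the square root of time, and the slope is inversely related to the linear-flow parameter (LFP). The sketch below uses synthetic, noise-free data with assumed slope and intercept; it illustrates the plotting technique only, not the new method of the paper.

```python
import math

# Synthetic linear-flow data: RNP = (p_i - p_wf)/q = m*sqrt(t) + b.
m_true, b_true = 12.0, 5.0                # assumed slope and intercept
t = [1.0, 4.0, 9.0, 16.0, 25.0]           # days
rnp = [m_true * math.sqrt(ti) + b_true for ti in t]

# Least-squares fit of RNP against sqrt(t) recovers the slope, which is
# then inverted for LFP using fluid and rock properties (not shown).
x = [math.sqrt(ti) for ti in t]
n = len(x)
sx, sy = sum(x), sum(rnp)
sxx = sum(xi * xi for xi in x)
sxy = sum(xi * yi for xi, yi in zip(x, rnp))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
print(round(slope, 6))  # 12.0, recovering the assumed slope
```

On real data, departure of the points from this straight line signals the end of linear flow, which is where boundary-dominated methods such as flowing material balance take over for fluid-in-place estimation.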
An innovative multi-deterministic scenario workflow was applied to one of the giant and complex carbonate reservoirs in the Middle East. The objective of this workflow was to quantify how geological uncertainties and different modelling decisions impact stock-tank oil-initially-in-place (STOIIP) estimates and flow behaviour in this reservoir. In particular, we focused on uncertainties related to the presence of fractures, reservoir rock typing, and modelling of the initial hydrocarbon distribution.
Based on the available static and dynamic data, we considered two key scenarios: the absence of fractures, and the presence of sparse, fault-controlled fractures. In the first scenario, we investigated how different reservoir rock typing methods impact permeability distributions. We further quantified changes in hydrocarbon distribution and analysed how a novel approach that combines capillary pressure and a log-derived J-function affects the saturation models. In the second scenario, we used effective medium theory to calculate permeability multipliers for the regions where fractures are expected. This enabled us to represent fractures effectively in a single-porosity reservoir model. The representativeness of the different models was analysed through blind tests using static data as well as history matching using dynamic data.
The most significant findings of our work are that subtle changes in modelling decisions and reservoir rock typing have major consequences for the saturation model, leading to changes of up to 20% in STOIIP estimates. Such uncertainties must be carried forward into future reservoir management decisions and reserves estimation. The blind tests showed that a saturation model based on the combination of core- and log-derived J-functions gave the most robust STOIIP estimates. These saturation models further led to a much-improved history match, especially for wells located in the transition zone of the reservoir. The best history matches were obtained once sparse, fault-controlled fractures were included in the reservoir model using effective medium theory. The presence of fractures specifically improved the history-matching quality for wells located close to the faults; these wells had been very difficult to match in the past.
Our work clearly demonstrates that a multi-deterministic scenario workflow is key to exploring the appropriate range of geological uncertainties and, equally important, that the impact of different modelling decisions must be accounted for when quantifying uncertainty during reservoir modelling. This is particularly applicable to giant carbonate reservoirs, where relatively minor changes in the workflow and data interpretation can have major consequences for STOIIP estimates, dynamic behaviour, and reserve estimates. Multi-stochastic modelling workflows that anchor the reservoir to a single base case are not capable of achieving this.