Tight gas is the term commonly used to refer to low permeability reservoirs that produce mainly dry natural gas. Many of the low permeability reservoirs that have been developed in the past are sandstone, but significant quantities of gas are also produced from low permeability carbonates, shales, and coal seams. Production of gas from coal seams is covered in a separate chapter in this handbook. In this chapter, production of gas from tight sandstones is the predominant theme. However, much of the same technology applies to tight carbonate and to gas shale reservoirs. Tight gas reservoirs have one thing in common--a vertical well drilled and completed in the tight gas reservoir must be successfully stimulated to produce at commercial gas flow rates and produce commercial gas volumes. Normally, a large hydraulic fracture treatment is required to produce gas economically.
As defined by the U.S. Federal Energy Regulatory Commission (U.S. FERC), low-permeability ("tight") gas reservoirs have an average in-situ permeability of 0.1 md or less. Others have placed the upper limit at 1 md. Estimates of ultimate recovery from these resources vary widely and depend chiefly on assumptions of wellhead gas price. Methods for estimating gas reserves in moderate- to high-permeability reservoirs are unreliable in very-low-permeability reservoirs. The unreliability can be attributed to the geologic setting in which these reservoirs occur and the completion methods required to make them commercial.
Each of these is discussed briefly in the next two sections. Thereafter--except for another section on probabilistic procedures near the end--the chapter focuses on deterministic procedures because they are still more widely used. Both deterministic and probabilistic procedures require the same basic data and equations. Reserves calculated with deterministic procedures are classified subjectively on the basis of professional judgment of the uncertainty in each reserve estimate and/or pertinent regulatory or corporate guidelines, whereas probabilistic procedures explicitly recognize that the uncertainties in the input data and in the equations used to calculate reserves may be significant.
In the 1970s, the United States government defined a tight gas reservoir as one in which the expected value of permeability to gas flow is less than 0.1 md. This was a political definition, used to determine which wells would receive federal and/or state tax credits for producing gas from tight reservoirs. In reality, the definition of a tight gas reservoir depends on many factors, each of which enters Darcy's law. The main problem with tight gas reservoirs is that they do not produce at economic flow rates unless they are stimulated--normally by a large hydraulic fracture treatment. Eq. 7.1 illustrates the main factors controlling flow rate: the flow rate, q, is a function of the permeability, k; the net pay thickness, h; the average reservoir pressure, p; the flowing bottomhole pressure, pwf; the fluid properties (formation volume factor, B, and viscosity, μ); the drainage radius, re; the wellbore radius, rw; and the skin factor, s. Thus, to choose a single value of permeability to define "tight ...
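Eq. 7.1 itself is not reproduced in this excerpt. A standard pseudosteady-state radial flow form consistent with the variables listed above is shown below only as a reference point (written for a liquid in oilfield units; the gas analog replaces pressure with pseudopressure):

\[
q = \frac{k\,h\,(\bar{p} - p_{wf})}{141.2\,B\,\mu\left[\ln\!\left(\frac{r_e}{r_w}\right) - \frac{3}{4} + s\right]}
\]

With permeability k in md, thickness h in ft, pressures in psi, viscosity in cp, and B in RB/STB, q is in STB/D; the equation makes clear that when k is very small, only a large effective wellbore (i.e., a hydraulic fracture) or a large drawdown can yield commercial rates.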
Temizel, Cenk (Aera Energy) | Balaji, Karthik (University of North Dakota) | Canbaz, Celal Hakan (Ege University) | Palabiyik, Yildiray (Istanbul Technical University) | Moreno, Raul (Smart Recovery) | Rabiei, Minou (University of North Dakota) | Zhou, Zifu (University of North Dakota) | Ranjith, Rahul (Far Technologies)
Owing to the complex characteristics of shale reservoirs, data-driven techniques offer fast and practical solutions for optimizing and better managing shale assets. Developments in data-driven techniques enable robust analysis not only of the primary depletion mechanisms but also of enhanced oil recovery in unconventionals, such as natural gas injection. This study provides a comprehensive background on the application of data-driven methods in the oil and gas industry--the process, methodology, and learnings--along with examples of data-driven analysis of natural gas injection in shale oil reservoirs using publicly available data.
Data are obtained and organized, and patterns in the production data are analyzed with data-driven methods to identify the key parameters in the recovery process and the operational strategies that best improve recovery. The complete process is illustrated step by step for clarity and to serve as a practical guide for readers. The study also describes what alternative physics-based evaluation methods can offer under current conditions of data availability and of understanding of the physics of recovery in shale oil assets, and compares their outcomes with those of the data-driven methods. In this way, a thorough comparison of physics-based and data-driven methods--their advantages, drawbacks, and challenges--is provided.
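The study's own code and data set are not reproduced here. The sketch below, using hypothetical column names for public well data, only illustrates the kind of pattern analysis described above: ranking how strongly completion and operational parameters influence recovery with a tree-ensemble model.

```python
# Illustrative sketch (not from the paper): rank which well parameters most
# influence cumulative oil recovery using a random-forest model.
# File name and column names are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

wells = pd.read_csv("public_well_data.csv")          # hypothetical public data set
features = ["lateral_length_ft", "proppant_lb_per_ft",
            "fluid_bbl_per_ft", "gas_injection_mscf", "stage_count"]
target = "cum_oil_bbl_24mo"

X_train, X_test, y_train, y_test = train_test_split(
    wells[features], wells[target], test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)

print("R^2 on held-out wells:", model.score(X_test, y_test))
# Relative importance of each parameter to the recovery prediction
for name, score in sorted(zip(features, model.feature_importances_),
                          key=lambda t: -t[1]):
    print(f"{name}: {score:.2f}")
```

A ranking like this is only a starting point; as the abstract notes, the real effort lies in organizing and filtering the raw field data before any such model is fitted.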
Data organization and filtering take significant time before the data-driven method itself is applied; nevertheless, data-driven methods are a practical solution in fields mature enough to have generated sufficient data for analysis, provided the methodology is applied carefully. The advantages, challenges, and associated risks of using data-driven methods are also discussed. The comparison between physics-based and data-driven methods illustrates the advantages and disadvantages of each, highlights the differences in evaluation and outcome, and offers a guideline for choosing the appropriate strategy and evaluation approach for a given asset.
A comprehensive understanding of the interactions between the key components of the formation, and of how the various elements of an EOR process affect those interactions, is of paramount importance. The few existing studies on natural gas injection in shale oil that use data-driven methods include comparisons with physics-based methods, but they do not examine the interrelationship between physics-based and data-driven methods as both complements and competitors in the current rise of unconventionals. This study closes that gap and serves as an up-to-date reference for industry professionals.
Production from highly paraffinic crude oil wells poses unique technical challenges, such as poor flowability and paraffin deposition on the production tubing. Paraffin deposition increases the lift load on the pump, reduces pump efficiency, and eventually plugs the pump. A common solution to restore the productivity of these wells is to inject hot oil or hot water at 160°F–200°F to remove the deposits; this practice increases operating cost and causes lost production because of well downtime.
Paraffin inhibitors (PI) and pour point depressants (PPD) have been used to treat paraffinic fluids but are not effective for wells with high water cuts; even when treated with PI/PPD, these wells still require high-cost maintenance such as hot oil/water jobs and/or well workovers. This paper presents a more effective treatment that uses tailored chemical mixtures to form a water dispersion with the paraffinic oil, thereby increasing oil flowability and reducing deposition. A novel test method was developed to evaluate the effectiveness of treatment chemicals on various paraffinic oils on the basis of flowability and cleanliness. The test method has been validated with field trial data from three different wells in the Uinta Basin, Utah, and the Julesburg Basin, Colorado.
The field trials showed a significant increase in pumping efficiency and crude oil production, and the need for hot water treatments was reduced or eliminated for the treated wells. Improved oil and produced-water quality was also observed. These results demonstrate that the water-dispersion-based treatment is more effective for high-paraffin wells with high water cuts.
A major shale producer in North America with established oil and gas production was facing severe paraffin deposition in downhole tubing and flowlines. Because chemical recommendations based on traditional screenings failed to deliver adequate inhibition, the operator turned to a costly remediation program to maintain production. We revisited the case, performed a root cause analysis, and looked for a chemical solution that could deliver cost savings. The field deposit obtained from the producer proved to be quite complex and exposed limitations in our internal high-temperature gas chromatography (HTGC) method for carbon chain analysis. Components in the sample were found to exceed the solubility limit of the carrier solvent, carbon disulfide (CS2), precipitate out of solution, and form a two-phase system. These components were believed to be higher-molecular-weight carbon chains (HMWC) above C70+ at concentrations high enough to exceed the solvent's solubility limit; this was the first time we had encountered such a sample. A systematic approach was applied to isolate the insoluble HMWC, which were then sent out for further analysis. MALDI-TOF and high-resolution carbon-13 NMR confirmed the presence of C90+ chains in the deposit at concentrations high enough to produce a trimodal paraffin distribution. To our knowledge, this is the first time a trimodal system has been documented.
Reliable estimation of organic matter characteristics is essential for drilling decisions, source rock evaluation, and unconventional reservoir production. These characteristics are conventionally measured by laboratory experiments on core samples, which is time-consuming and economically challenging. In this study, we present a new approach to evaluating the organic matter characteristics of source and reservoir rocks by in-situ electrical heating and temperature transient analysis under in-situ conditions.
The new approach combines inverse modeling of the in-situ heater temperature monitored during electrical heating with machine learning. Electrical heating is applied for in-situ pyrolysis to determine the characteristics of the organic matter: the kerogen volume fraction and the activation energy of the decomposition reaction. The heater temperature acts as an indicator of kerogen type and maturity because it is affected by the bulk thermal conductivity of the formation, which in turn depends on the rock-and-pore composition that changes dynamically as the kerogen decomposes. A full-physics simulation model of in-situ kerogen pyrolysis is used to generate the electrical heater temperature output that serves as input to the learning-based models. Minimal simplification of the physical and chemical phenomena in the full-physics model, which describes multicomponent, multiphase, nonisothermal systems with kinetic reactions, gives confidence in the synthetic heater temperature data.
The full-physics simulation model computes system responses under unknown and uncertain input parameters that determine the reactivity of kerogen pyrolysis. It generates sets of heater temperature transient data during heating at constant heat flux for 300 simulated source rocks containing Type 1, 2, and 3 kerogens with various organic matter contents and activation energies. Using these heater temperature transients as inputs, an artificial neural network (ANN) is trained as a black-box model to estimate the unknown organic matter content and activation energy. The ANN model estimates these parameters more accurately for Type 2 and 3 kerogens, which span wide ranges of activation energy, than for Type 1 kerogen, which has a narrow range. A support vector machine (SVM), which separates data into classes using hyperplanes, is applied to classify the heater temperature transients by kerogen type and performs well.
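The paper's simulation output is not available in this excerpt. The sketch below, using randomly generated placeholder arrays standing in for the 300 simulated transients, only illustrates the two learning steps described above: an MLP regressor for organic matter content and activation energy, and an SVM classifier for kerogen type.

```python
# Illustrative sketch (not the authors' code): learn from synthetic heater
# temperature transients. Arrays are random placeholders for the 300
# full-physics simulation cases described in the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_cases, n_timesteps = 300, 100
X = rng.random((n_cases, n_timesteps))      # heater temperature transients
y_reg = rng.random((n_cases, 2))            # [kerogen fraction, activation energy]
y_cls = rng.integers(1, 4, n_cases)         # kerogen type 1, 2, or 3

X_tr, X_te, yr_tr, yr_te, yc_tr, yc_te = train_test_split(
    X, y_reg, y_cls, test_size=0.2, random_state=0)

# ANN (multilayer perceptron) to estimate content and activation energy
ann = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
ann.fit(X_tr, yr_tr)
print("regression R^2:", ann.score(X_te, yr_te))

# SVM to classify kerogen type from the same transients
svm = SVC(kernel="rbf")
svm.fit(X_tr, yc_tr)
print("classification accuracy:", svm.score(X_te, yc_te))
```

With real simulation output in place of the random placeholders, the regression targets would come from the kerogen properties assigned to each simulated rock, exactly as the inverse-modeling workflow describes.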
The new in-situ organic matter characterization technology presented in this study provides reliable information on the type and maturity of organic matter in source rocks without laboratory experiments on core samples. It is expected to enable realistic evaluation of source rocks under subsurface conditions by resolving these technical and economic challenges.
The development of unconventional resources is capital intensive and challenging, with operators spending substantial resources to maximize value; this is a direct result of completing thousands of wells with multistage fracturing. Optimizing well completions to enhance hydrocarbon recovery helps reduce development costs and improve project economics under geological, engineering, and economic uncertainty.
The paper demonstrates a novel workflow for optimizing completion design by integrating advanced multistage fracture modeling with reservoir simulation in an unconventional resource play. The integrated workflow uses a compositional dynamic simulation study for a gas condensate well. The complexity of gas flow physics in both the nanodarcy matrix and the hydraulically fractured stimulated rock volume (SRV) is considered, including gas desorption, pressure-dependent permeability, non-Darcy flow, and gas condensate fluid behavior.
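The paper's specific correlations are not reproduced here. The expressions below are standard forms commonly used for two of the listed effects, shown only as reference points: a Langmuir isotherm for gas desorption and an exponential pressure-dependent permeability.

\[
V_E(p) = \frac{V_L\,p}{p_L + p}, \qquad k(p) = k_i\,e^{-\gamma\,(p_i - p)}
\]

Here V_L is the Langmuir volume, p_L the Langmuir pressure, k_i the permeability at initial reservoir pressure p_i, and γ the permeability modulus; the actual functional forms and coefficients used in the study may differ.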
The workflow includes QA/QC of the geologic model at fine resolution to map the hydraulic fractures. Long-term flowback data are used to calibrate the simulation model through history matching, with matching regions defined following the analytical trilinear model. After a reasonable history match was achieved, a detailed uncertainty assessment was performed to estimate the P10, P50, and P90 of the well's estimated ultimate recovery (EUR) using a proxy-modeling workflow. Uncertainty parameters include hydraulic fracture half-length, SRV permeability, dew point pressure, undersaturated desorption pressure, rock compaction trend, etc.
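The calibrated proxy model belongs to the study and is not given here. The sketch below, with a hypothetical placeholder proxy function and assumed parameter ranges, only illustrates how P10/P50/P90 EUR values are typically extracted from such an uncertainty assessment by Monte Carlo sampling.

```python
# Illustrative sketch (not from the paper): Monte Carlo sampling of uncertain
# parameters through a placeholder proxy to report P10/P50/P90 EUR.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Hypothetical uncertainty ranges (uniform for simplicity)
xf = rng.uniform(150.0, 450.0, n)        # fracture half-length, ft
k_srv = rng.uniform(50.0, 500.0, n)      # SRV permeability, nd
p_dew = rng.uniform(3000.0, 4500.0, n)   # dew point pressure, psia

def proxy_eur(xf, k_srv, p_dew):
    """Placeholder response surface standing in for the calibrated proxy model."""
    return 0.8 * (xf / 300.0) ** 0.5 * (k_srv / 200.0) ** 0.3 * (4000.0 / p_dew) ** 0.2

eur = proxy_eur(xf, k_srv, p_dew)        # hypothetical units (e.g., Bcf)

# Petroleum convention: P10 is the high-side (90th-percentile) outcome.
p90, p50, p10 = np.percentile(eur, [10, 50, 90])
print(f"P90={p90:.2f}  P50={p50:.2f}  P10={p10:.2f}")
```

In practice the proxy is fitted to the history-matched simulation runs, and the sampled distributions for each uncertainty parameter reflect the ranges established during the assessment.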
Finally, what-if scenarios were performed to assess the impact of cluster spacing, fracture height, horizontal well length, and minimum wellhead pressure (WHP) on the well's EUR.
The results of this work illustrate how the workflow is used to optimize well completion design, including the number of stages along the lateral, the lateral length, and the treatment size, and how these choices affect well performance, supporting management decision making.
The US Energy Information Administration (EIA) has added new play production data to its shale gas and tight oil reports. Last December, US shale and tight plays produced approximately 65 Bcf/D of natural gas and 7 million B/D of crude oil, accounting for roughly 70% and 60% of total US natural gas and crude oil production, respectively. These totals represent a significant jump over the last 10 years: a decade earlier, shale gas accounted for 16% of total US gas production and tight oil for approximately 12% of total US crude oil production, according to EIA statistics. EIA updated its production volume estimates to include seven additional shale gas and tight oil plays, increasing the estimated share of shale gas by 9% and of tight oil by 8% compared with previously estimated shale production volumes. The change captures increasing production from new, emerging plays as well as from older plays that had previously been in decline but are now rebounding because of advances in horizontal drilling and hydraulic fracturing.