Any procedure that extracts and displays these seismic parameters in a convenient, understandable format is an invaluable interpretation tool. Taner and Sheriff and, later, Taner et al. introduced the use of the Hilbert transform to calculate seismic amplitude, phase, and frequency instantaneously, meaning that a value of each attribute is computed for every time sample of a seismic trace. Since that introduction, numerous Hilbert transform algorithms have been implemented to calculate these useful seismic attributes (e.g., Hardage). Figure 1 illustrates the concept of a complex seismic trace, in which x(t) represents the real seismic trace and y(t) is its Hilbert transform. In this discussion, we set aside what a Hilbert transform is and how y(t) is calculated.
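The computation described above can be sketched in a few lines. This is a minimal, generic illustration (not any published implementation): the analytic trace z(t) = x(t) + i y(t) is built in the frequency domain, and the three instantaneous attributes follow from its modulus and argument. The test signal and sample interval are assumptions for demonstration.

```python
import numpy as np

def instantaneous_attributes(x, dt):
    """Instantaneous amplitude, phase, and frequency of a real trace x.

    The analytic trace z(t) = x(t) + i*y(t), with y the Hilbert transform
    of x, is built by zeroing negative frequencies and doubling positive
    ones in the spectrum of x.
    """
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    z = np.fft.ifft(X * h)                        # analytic trace
    amplitude = np.abs(z)                         # reflection strength
    phase = np.unwrap(np.angle(z))                # instantaneous phase (rad)
    freq = np.gradient(phase, dt) / (2 * np.pi)   # instantaneous frequency (Hz)
    return amplitude, phase, freq

# A pure 30 Hz cosine should yield unit amplitude and ~30 Hz everywhere.
dt = 0.002
t = np.arange(0, 1, dt)
amp, ph, f = instantaneous_attributes(np.cos(2 * np.pi * 30 * t), dt)
```

Because one value of each attribute falls out per time sample, the outputs can be displayed directly alongside the input trace.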
It is often necessary to investigate the forecast uncertainty for a portfolio of fields or reservoirs, for example to evaluate the risks and opportunities of an exploration portfolio or a new business strategy, to support an "urban planning" study, or to quantify uncertainty in a regional portfolio. It is important to understand whether the portfolio uncertainties are dependent or independent. Cases 1, 2 and 3 are often dependent, with complex interactions of the parameters, strong inter-dependencies (both positive and negative correlations), and common system constraints. In this case, a comprehensive Monte Carlo analysis of the system is recommended, one that includes all the complex system interactions. Aggregation tools are available in the industry to perform this probabilistic aggregation after individual field forecasts have been generated; however, most IPSM tools can evaluate the system uncertainty for all assets concurrently, and this latter approach would be preferred but is sometimes considered too time-consuming.
A pixel-based model assumes that the variable to be simulated is a realization of a continuous (Gaussian) random function. Using the spatial model, search ellipse, and control data, a pixel-based method simulates values grid node by grid node. Some of the most popular pixel-based algorithms are: turning bands, sequential Gaussian, sequential indicator, truncated Gaussian, and simulated annealing. Each method can produce a range of realizations that capture the uncertainty of an regionalized variable (RV), and so the method choice here will be based on the goals and on data types and their availability. The pixel-based method works best in the presence of facies associations that vary smoothly across the reservoir, as often is the case in deltaic or shallow marine reservoirs.
Over the years, attempts have been made to track the working history of coiled tubing (CT) strings in service to maximize the service utility of the tube while minimizing fatigue failures. As a result, three commonly used methodologies for predicting the fatigue condition of the CT were developed. A relatively simplistic approach used to predict the working life of coil tubing is commonly described as the "running-feet" method, in which the footage of tubing deployed into a wellbore is recorded for each job performed. This deployed footage is then added to the existing record of footage deployed in service for any given string. Depending upon the service environment, type of commonly performed services, and local field history, the CT string is retired when the total number of running feet reaches a predetermined amount.
Integral transforms are useful in solving differential equations. A special form of the linear integral transforms, known as the Laplace transformation, is particularly useful in the solution of the diffusion equation in transient flow. The following fundamental properties of the Laplace transformation are useful in the solution of common transient flow problems. For the Laplace transform to be useful, the inverse Laplace transformation must be uniquely defined. In this operation, p(t) represents the inverse (transform) of the Laplace domain function, .
Shale gas condensate is known to form more readily in smaller than larger pores at the same reservoir conditions and can reduce the mobility of the gaseous phase significantly not only in individual pores but also in a pore system, to limit gas production. To investigate the interplay of fluid confinement factors on effective gas flow behaviors in shale, in this work we developed a new phase equilibrium calculation algorithm for evaluating phase properties in pores of variable sizes and a shale-gas pore-network model. We coupled them into a workflow to study the effect of shale gas condensate on the gas permeability on selected pore networks, considering an empirical criterion of condensate pore bridging and gas flow and transport mechanisms.
The workflow makes use of Soave-Redlich–Kwong (SRK) EOS to model the fluid phase behavior, Zuo and Stenby's parachor based method to predict IFT and Pedersen's corresponding state model to predict viscosity by an iterative procedure underpinned by a modified negative flash algorithm. For any given pore network, this procedure is applied to calculate a full set of PVT properties for any confined reservoir fluid and to evaluate pore-bridging criterion for each and every pore element in the network before performing pore-network gas flow simulation. The pore network model is implemented to simulate the real gas flow in nanoscale porous media, taking into account the contributions from non-Darcy flow, adsorption, surface diffusion, and the formation of condensate bridges.
Using an Eagle Ford gas condensate sample, this study shows that the decreasing pore size has led to an increase in condensate dropouts in a nanoscale single cylindrical pore. The condensate liquid has moderately higher PVT properties compared to the gas phase. The differences of those properties between condensate and gas phases become smaller with the decrement of pore size. This trend appears to be opposite to those of a non-confined fluid in which condensate drops out due to pressure depletion. It has implied that the fluid confinement effect causes less flow resistance than pressure deletion even if the amount of dropouts are the same. The roles of both the pore space confinement and topology were examined on representative models. The results from the simulations on uniform pore networks show gas adsorption and surface diffusion have opposite effects and result in a minor net negative impact to the apparent permeability even when pore radius is less than 50 nm. The simulation results on a regular pore network with randomly distributed pore size show there is only a limit impact on apparent permeability as a result of condensate bridging in small pores, but with an increase in tortuosity, this impact increases.
Baek, Seunghwan (Texas A&M University) | Akkutlu, I. Yucel (Texas A&M University) | Lu, Baoping (Sinopec Research Institute for Petroleum Engineering) | Ding, Shidong (Sinopec Research Institute for Petroleum Engineering) | Xia, Wenwu (Harding Shelton Petroleum Engineering & Technology Limited)
Routine history-matching and reservoir calibration methods for horizontal wells with multiple hydraulic fractures are complex. Calibration of important fracture and matrix quantities is, however, essential to understand the reservoir and estimate the future recoveries. In this paper, we propose a robust method of simulation-based history-matching and reserve prediction by incorporating an analytical solution of production Rate Transient Analysis (RTA) as an added constraint. The analytical solution gives the fracture surface area contributing to the drainage of the fluids from the matrix into the fractures. The surface area obtained from the RTA is the effective area associated with the production—not total area. It is the most fundamental and the most significant quantity in the optimization problem. Differential evolution (DE) algorithm and a multi-scale shale gas reservoir flow simulator are used during the optimization. We show that the RTA-based optimization predicts the quantities related to completion design significantly better. Further, we show how the estimated total fracture surface area can be used to measure the hydraulic fracturing quality index, as an indication of the quality of the well completion operation. The most importantly, we predict that the fractures under closure stress begin to close much sooner (100 days) than the prediction without the RTA-based fracture surface area constraint. The deformation continues under constant closure stress for about 20 years, when the fractures are closed nearly completely. This work attempts to use the traditional reservoir optimization technologies to predict not only the reserve but also the life of the unconventional well.
Production analysis for low permeability reservoirs is crucial in characterizing flow regimes and reservoir properties. However, outliers, aberrant segments, and systematic noise often disturb and can cause significant bias. Additionally, incomplete sampling during system offline or failure can truncate useful information and reduce our capability to interpret reservoir characteristics. Uniform or non-uniform sampling with different sampling frequencies and increasing bytes in the dataset can also complicate the analysis. In this paper, we present a Denoising Long Short-Term Memory (DeLSTM) approach for noise reduction in production rate surveillance, and additionally provide a physics-based, deep-learning workflow based on a traditional polynomial window-based least square fitting - Savitzky-Golay (SG) filter, coupling with a Cascaded Long Short-Term Memory (CLSTM) Network, to facilitate surveillance of pressures and production rates.
Recorded pressures and rates are time-series data pairs with spatial dependency. Two production surveillance scenarios are considered. One is to simply monitor the production rates and perform traditional decline curve analysis (DCA) by assuming constant bottomhole pressure with boundary dominated flow (BDF) being reached. The other is to incorporate pressure data, and ensemble analyses could further improve the surveillance quality and production data analysis. We first present the DeLSTM algorithm solely to denoise the noisy rate data with existing outliers and aberrant segments and further reconstruct the missing rates in the first scenario. Then, to better preserve the characteristics of the original datasets with the presence of pressure information, we used SG-CLSTM with the superposition physics-based training features for rate reconstruction with high sampling frequency and incomplete sampling occurred in transient flow regimes. The proposed workflow was first applied to a synthetic case with random outliers, aberrant segments, and artificial white Gaussian noise to approximate the measured signals collected in the field. The missing segment was obtained by hiding a period of production data. The formed incomplete samples were then input into DeLSTM and SG-CLSTM for rate, pressure-rate reconstruction respectively. Then, the incomplete histories were generated and back-tested with the hidden dataset.
Gong, Yiwen (The Ohio State University) | Mehana, Mohamed (University of Oklahoma) | El-Monier, Ilham (The Ohio State University) | Xu, Feng (Research Institute of Petroleum Exploration and Development Co. Ltd. CNPC / China National Oil and Gas Exploration and Development Corporation) | Xiong, Fengyang (The Ohio State University)
The accurate estimation of the elastic properties of the rock is of great importance for designing a successful hydraulic fracturing. Among these properties, Young's modulus and Poisson's ratio essentially control fracture aperture and conductivity. However, the fissile nature of the shale rock largely challenges the mechanical properties measurement using a cylindrical core sample. While the nanoindentation technology can be applied to measure small chips of rock fragment, but reproducible experiments are required to provide an unbiased estimation. Herein, we are proposing a machine learning approach to predict the elastic moduli. We utilized an ensemble of data mining techniques and a database that include both the mineralogy and pore characteristics. Our results indicate that K-Means clustering yields best performance on data classification than all other tested methods while the elastic moduli estimation from Artificial Neural Network (ANN)is most accurate than Support Vector Machine (SVM), Multivariate Linear Regression (MLR) and Multivariate Adaptive Regression Spine (MARS). The dimension reduction became essential when then input datasets are remarkably correlated. The supervised learning techniques with our proposed approach leverage the usability of the lab experiment data and overcome disadvantages of the traditional elastic moduli measurement. It also further lands the far-reaching guide for the fracturing design.
Machine learning have recently revolutionized the oil and gas industry (Alcocer and Rodrigues 2001, Al-Fattah and Startzman 2001, Kohli and Arora 2014, Okpo et al. 2016, Sinha et al. 2016, Tariq et al. 2017, Luo et al. 2018, Nande 2018, Rashidi et al. 2018, Sidaoui et al. 2018, Xu et al. 2019). As a data-rich industry, machine learning finds applications in every corner ranging from production forecast to drilling efficiency (Hegde and Gray 2017, Fulford et al. 2016). Given the significance of geomechanical properties of the rock, the volume of studies has attempted to leverage machine learning techniques. For instance, Li et al. (2018) developed a workflow implementing various machine learning algorithm to accurately provide an alternative to synthesize the sonic logs and geomechanical properties afterwards. In the same time, Hadi and Nygaard (2018) used Artificial Neural Network (ANN) to develop an empirical model to estimate the shear velocity from conventional logs. Another dimension was presented by Jain et al. (2015) where they proposed an approach to integrate both core and log spectroscopy which provided better estimations of the mineralogy.