Shales are the most commonly found sedimentary rocks on Earth. Most US shale plays are massive, with regions of differing maturity and varying prospectivity. The last decade has seen a paradigm shift in the understanding of shale anisotropy and microstructure, with increasing focus on identifying sweet spots and optimal reservoir-quality zones. Rock typing is one of the most sought-after techniques for achieving this objective, and it has become an integral part of the unconventional-characterization work flow.
In this work, rock typing was performed with an integrated work flow using laboratory petrophysical measurements. The rock types were derived with machine-learning clustering algorithms—namely, K-means and self-organizing maps (SOM). The integrated work flow was applied in three different shale plays: Eagle Ford, Barnett, and Woodford.
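A minimal sketch of the clustering step, using scikit-learn's K-means on synthetic porosity/TOC pairs standing in for the laboratory petrophysical measurements (the data, cluster centers, and feature choice here are illustrative assumptions, not the study's dataset):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical core measurements: porosity (fraction) and TOC (wt%) per plug,
# drawn around three loosely separated centers to mimic three rock types
rng = np.random.default_rng(0)
samples = np.vstack([
    rng.normal([0.09, 5.0], [0.01, 0.5], (40, 2)),  # high porosity / high TOC
    rng.normal([0.07, 3.5], [0.01, 0.5], (40, 2)),  # intermediate
    rng.normal([0.05, 2.0], [0.01, 0.5], (40, 2)),  # low porosity / low TOC
])

# Standardize so porosity and TOC contribute comparably to the distance metric
X = StandardScaler().fit_transform(samples)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
```

In practice the full measurement suite (mineralogy, saturation, etc.) would be clustered the same way, and SOM would provide an alternative, topology-preserving grouping for comparison.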
Three different rock types were identified. In general, Rock Type 1 had the highest porosity and total organic carbon (TOC), indicative of highest storage and source-rock potential, respectively. Rock Type 1 was also the key rock type controlling the production. Rock Type 2 had intermediate porosity and TOC, whereas Rock Type 3 had the lowest porosity and TOC.
Next, core-derived rock types had to be scaled up to logs. Support vector machines (SVM), a classification algorithm, was used for scaling up. It was trained with a data set consisting of depths at which both core and log data were available. Different logs such as gamma ray, resistivity, neutron, and density were used for scaling up. Finally, a rock-type ratio (RTR) was defined from rock-type logs based on the fraction of Rock Type 1 over the gross thickness. The ratio thus developed was found to have a strong correlation with normalized oil-equivalent production rate.
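The upscaling and RTR steps can be sketched as follows, with scikit-learn's SVC trained at cored depths and applied to uncored depths. The log values, labels, and kernel settings are synthetic placeholders, not the study's calibration:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
# Hypothetical training set: (gamma ray, resistivity, neutron, density)
# at cored depths, with core-derived rock-type labels 1..3
X_core = rng.normal(0, 1, (90, 4))
y_core = np.where(X_core[:, 0] > 0.3, 1, np.where(X_core[:, 0] > -0.3, 2, 3))

# Standardize logs, then fit an RBF-kernel SVM classifier
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X_core, y_core)

# Predict a rock-type log at 500 uncored depth samples
X_logs = rng.normal(0, 1, (500, 4))
rocktype_log = clf.predict(X_logs)

# Rock-type ratio: fraction of Rock Type 1 over gross thickness
rtr = float(np.mean(rocktype_log == 1))
```

With equally spaced depth samples, the mean of the Rock Type 1 indicator is exactly the thickness fraction the abstract describes.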
In total, 22 wells with core data were considered for rock typing in the three shale plays. The rock types were scaled up to 95 wells over a cumulative depth interval of more than 20,000 ft. The work flow shown in this paper can easily be extended to other data sets in other plays. The manual approach, on the other hand, can be prohibitively time-consuming.
Sinha, Saurabh (University of Oklahoma) | Marfurt, Kurt J. (University of Oklahoma) | Devegowda, Deepak (University of Oklahoma) | Pires de Lima, Rafael (University of Oklahoma) | Verma, Sumit (University of Texas, Permian Basin)
Estimation of stimulated rock volume (SRV) is the cornerstone of field-development planning in shale reservoirs. The estimated ultimate recovery (EUR) has a first-order dependency on the SRV, and therefore SRV estimation is critical for field development.
In this paper, we propose a methodology to estimate the SRV and hence the EUR for a shale reservoir using seismic data, flow and geomechanical simulation. The backbone of our methodology is seismic inversion coupled with geomechanical simulation. We apply our technique to data acquired from the Barnett shale.
In this work, we first use 3D seismic and sonic logs to perform prestack seismic inversion. Then, we derive the distribution of Poisson's ratio and Young's modulus in the area of interest (AOI). We constrain the porosity in our geocellular model using a rock-type model. Our rock-type model for this work is based on k-means clustering of multiwell log data.
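The dynamic moduli follow from the inverted P-velocity, S-velocity, and density volumes via the standard isotropic elasticity relations; a sketch (input values are illustrative, not from the inversion itself):

```python
def elastic_moduli(vp, vs, rho):
    """Dynamic Poisson's ratio and Young's modulus (Pa) from P- and
    S-wave velocities (m/s) and bulk density (kg/m^3), as would be
    computed cell by cell from inverted volumes."""
    vp2, vs2 = vp ** 2, vs ** 2
    nu = (vp2 - 2.0 * vs2) / (2.0 * (vp2 - vs2))
    e = rho * vs2 * (3.0 * vp2 - 4.0 * vs2) / (vp2 - vs2)
    return nu, e

# Example: plausible shale values (assumed, not from the Barnett dataset)
nu, e = elastic_moduli(4500.0, 2500.0, 2500.0)
```

The resulting Young's modulus and Poisson's ratio grids are what feed the geomechanical coupling described next.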
We modeled a well in the AOI for which microseismic data are available. We used a coupled flow and geomechanical simulator to mimic the fracturing process and the fluid volumes injected during the actual completion of the well. For geomechanical coupling, we used the Barton-Bandis model with the seismic-inversion-derived Young's modulus and Poisson's ratio 3D volumes. Next, we compare our results with the SRV obtained by an analysis of microseismic data. We reconcile differences in the model-derived SRV, then calibrate the resulting flow model and use the history-matched model for forecasting production.
Our results indicate an excellent match on SRV and therefore on production data. Because we use variable geomechanical parameters along the lateral, we observe irregular SRVs and drainage areas consistent with the microseismic data. Our methodology for predicting microseismic response can be used for asset evaluation, acreage prioritization, and optimization of completion design in unconventional plays.
This work was motivated by field observations of two infinite-acting flow regimes in multifractured horizontal wells, seen as two straight lines on a crossplot of cumulative production versus the square root of time. The occurrence of one linear flow regime followed by another, termed linear post linear flow, is found to be the most prevalent signature in production data analyses in several plays such as the Permian Basin, the Eagle Ford Shale, the Bakken Shale, and the SCOOP and STACK plays of Oklahoma.
The objectives of this study were to determine why this phenomenon occurs and its implications for forecasting production and determining reserves. Numerical simulations representing an element of symmetry along a multifractured horizontal well containing three fractures were run with varying fracture spacing and length. The simulations were successful in predicting linear post linear flow.
Classic analysis for multifractured horizontals contains two flow regimes, infinite-acting and boundary-dominated flow, as shown in Fig. 1. The cumulative-production-versus-square-root-of-time analysis was suggested by Wattenbarger et al. (1998) and utilized by Rodrigues and Callard (2012). The curve through the boundary-dominated data is generated by an Arps (1956) hyperbolic forecast.
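The two diagnostics referenced above can be sketched numerically: during infinite-acting linear flow, rate declines as 1/sqrt(t), so cumulative production plots as a straight line against sqrt(t); the Arps hyperbolic model handles the later boundary-dominated data. The rates below are synthetic, chosen only to reproduce the signature:

```python
import numpy as np

def arps_hyperbolic(t, qi, di, b):
    """Arps (1956) hyperbolic rate decline: q(t) = qi / (1 + b*di*t)^(1/b)."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

# Infinite-acting linear flow: rate ~ 1/sqrt(t) -> cumulative ~ sqrt(t)
t = np.linspace(1.0, 400.0, 400)      # time, days (synthetic, 1-day spacing)
q_linear = 1000.0 / np.sqrt(t)        # synthetic rate during linear flow
Q = np.cumsum(q_linear)               # cumulative production

# The straight-line diagnostic: fit Q against sqrt(t)
slope, intercept = np.polyfit(np.sqrt(t), Q, 1)
```

A change in the slope of Q versus sqrt(t), rather than a departure from linearity, is what distinguishes the linear post linear signature from the onset of boundary-dominated flow.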
The contribution from outside the Stimulated Rock Volume (SRV) to well performance is generally limited in shale wells (Ambrose et al. 2011) necessitating optimal well and fracture spacing. The well spacing is often also governed by the recovery factors per well or recovery factors per lease in the case of PAD wells. In this work, we quantify recovery factors for downspaced shale wells using a reservoir simulation approach supported by rate transient analysis (RTA) and geomechanical and petrophysical studies.
Our analysis is based on data from a PAD well in the Eagle Ford shale. First, we identify different petrofacies in the area of study from well logs. These are then populated in a 3D reservoir simulation model with the corresponding rock-type (RT)-constrained porosity and water saturation. Permeability estimates are derived from an RTA-based workflow. The well is completed in the rich gas condensate window of the Eagle Ford, and an equation of state (EOS) model is calibrated to available laboratory data.
To mimic the fracturing process, we have implemented in the flow simulation the modified Barton-Bandis model suggested by Tran et al. (2009) to model the opening and closing of fractures and the corresponding effective stress changes during injection/production. We couple the geomechanical model iteratively into the flow model to solve for both stresses and fluid flow in the reservoir and history match gas rates, condensate rates, water rates, and flowing bottomhole pressure (FBHP) simultaneously.
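For intuition, the classical Barton-Bandis normal-closure relation underlying such models can be sketched as below; joint closure grows hyperbolically with effective normal stress and asymptotes to a maximum closure, and fracture conductivity is often taken to scale with the cube of the remaining aperture. The parameter values are illustrative only, and the modified formulation of Tran et al. (2009) adds coupling terms not shown here:

```python
def bb_closure(sigma_n, kni, vm):
    """Barton-Bandis hyperbolic normal closure (sketch): closure rises
    with effective normal stress sigma_n and asymptotes to the maximum
    closure vm; kni is the initial normal stiffness. Units must simply
    be consistent (e.g., stress in Pa, lengths in m)."""
    return sigma_n / (kni + sigma_n / vm)

def fracture_conductivity(a0, sigma_n, kni, vm):
    """Cubic-law proxy for conductivity from the current mechanical
    aperture, a0 minus the Barton-Bandis closure (floored at zero)."""
    a = max(a0 - bb_closure(sigma_n, kni, vm), 0.0)
    return a ** 3
```

As effective stress rises during production the aperture, and hence conductivity, drops, which is the mechanism the coupled model uses to open fractures during injection and partially close them afterward.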
We use the history-matched model to run sensitivities for both upspacing and downspacing scenarios with respect to the conventional 500-ft spacing. In all sensitivity scenarios, we monitor the growth of fractures with the actual pumping schedule to investigate fracture growth and possible completion strategies. Finally, we summarize our results for downspacing and upspacing of wells. Our results show that the optimal well spacing in the area of study in the Eagle Ford is 420 ft.
High-productivity wells are obtained by drilling in sweet spots and applying optimal fracture treatments. Reservoir characterization is key to identifying both the sweet spots and the optimal completion zones. Rock typing is an integral part of the reservoir characterization workflow that identifies different flow units. In this work, we present an integrated workflow for rock typing using lab petrophysical measurements and well logs acquired from six wells in the Wolfcamp formation in the Permian basin. Only three wells had cores. The petrophysical measurements were sampled at 3-ft intervals from 900 ft of continuous core recovered from these three wells. Unsupervised clustering algorithms such as K-means and Self Organizing Maps (SOM) were used to define rock types.
Rock Type 1 is characterized by the highest porosity (7-10%) and TOC (4-6%). Not surprisingly, Rock Type 1 had the highest positive impact on well productivity. Rock Type 2 had intermediate values of porosity (6-8%) and TOC (3-4.5%) and moderate source potential and storage. Rock Type 3 had the highest carbonate (60-80%) content and poor storage and source rock potential (2-4% TOC, 4-7% porosity). Rock Type 3 was the worst rock type.
A classifier built from well-log data and conditioned to core data was created. The classifier was applied at depth locations where core data were not available. The classifier included GR and neutron porosity logs; these were used because they were the logs commonly available in all the wells in the study. We define a rock type ratio (RTR) based on the fraction of Rock Type 1 over the total perforated interval. RTR was found to strongly correlate with oil production rate. The advantage of the workflow developed here is that it can easily be generalized to diverse data types and other plays in different geologic settings.
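The RTR-production correlation check amounts to the following, sketched here with synthetic per-well rock-type logs and synthetic rates (none of these numbers come from the Wolfcamp dataset):

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical rock-type logs (1 = best, 3 = worst) over each well's
# perforated interval, 200 depth samples per well, 12 wells
wells = [rng.integers(1, 4, size=200) for _ in range(12)]

# Rock type ratio: fraction of Rock Type 1 over the perforated interval
rtr = np.array([np.mean(w == 1) for w in wells])

# Synthetic oil rates that increase with RTR plus noise, standing in for
# observed production when computing the correlation
rate = 500.0 * rtr + rng.normal(0.0, 10.0, size=rtr.size)
r = np.corrcoef(rtr, rate)[0, 1]
```

In the study the reported correlation is against actual oil production rate per well; the point of the sketch is only that RTR reduces a full rock-type log to a single well-ranking number.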
The improvement in the rate of success of wildcat drilling from 75% in 1974 to 95% in 2010 has been widely attributed to improvements in reservoir characterization techniques (Williams 2008). One of these techniques, rock typing, is central to meaningful interpretation of the diverse data types acquired over diverse length scales and with varying resolutions. This is true for both conventional and unconventional reservoirs.
In this paper, we present the results of an elaborate simulation study from the Eagle Ford shale, integrating rate transient analysis (RTA), microseismic interpretation, and rock typing to quantify well recovery factors.
First, we build a fine-scale geocellular model to represent the lithology of the Eagle Ford Shale (EFS) in the area of interest. A fully compositional fluid model was then integrated into the geocellular model. The stimulated rock volume (SRV) is based on microseismic data interpretation. The permeability of the system is calibrated from rate transient analysis (RTA) following a previously published methodology.
Our results show that the conventional 500-ft spacing is not optimal in the rich-condensate area of the Eagle Ford shale. Additionally, there was little to no difference in the predicted EURs between staggered and same-layer completions. Microseismic data suggest that fractures grow up to the top of the EFS and are not limited to the lower EFS. We also perform sensitivity studies with respect to drawdown that show minimal or no condensate banking with choke management. Early-time condensate banking is, however, very sensitive to higher drawdown pressures.
Unconventional resource plays have shaken the energy industry in the last decade, changing the energy import-export equation, especially in the United States. US shale plays now contribute about 40% of total natural gas production.
In this paper, we describe the application of multivariate statistical techniques to generate type curves (TCs) for Arps decline analysis. Our study includes more than 1,500 wells from different fluid types (dry gas, lean gas, rich gas, and light oil) in the Eagle Ford Shale (EFS).
Data such as well logs, completion, production, and geology are included in the analysis as input variables. The analysis demonstrates the application of multivariate statistical techniques such as principal component analysis (PCA), k-means clustering, self-organizing maps (SOM), and multivariate regression (MVR).
Our analysis shows that univariate statistics are insufficient for analyzing the data because of the significant heterogeneity of shale reservoirs. With univariate statistics, the impact of variables such as well spacing, condensate-gas ratio (CGR), perforated lateral length, and zipper fracturing cannot be accounted for in a holistic way. Multivariate analysis can quantify the impact of these variables and can serve as a good predictive tool for determining Arps parameters.
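A minimal sketch of the PCA-plus-multivariate-regression step, here done with NumPy alone (SVD-based PCA, least-squares regression) on a synthetic well table; the variables, target, and coefficients are placeholders for the study's actual inputs and Arps parameters:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical well table: e.g., lateral length, proppant per ft,
# well spacing, CGR (standardized), 300 wells x 4 variables
X = rng.normal(0.0, 1.0, (300, 4))
# Synthetic Arps decline exponent b driven by the first two variables
b = 0.8 + 0.1 * X[:, 0] - 0.05 * X[:, 1] + rng.normal(0.0, 0.02, 300)

# PCA via SVD on the centered matrix; keep the two leading components
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T[:, :2]

# Multivariate linear regression of b on the PC scores (with intercept)
A = np.column_stack([np.ones(len(scores)), scores])
coef, *_ = np.linalg.lstsq(A, b, rcond=None)
b_hat = A @ coef
r2 = 1.0 - np.sum((b - b_hat) ** 2) / np.sum((b - b.mean()) ** 2)
```

The same scores can also feed k-means or SOM to group wells into type-curve families before fitting per-cluster Arps parameters.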
The objective of this paper is to integrate the "Big Data" concept with petroleum engineering knowledge for the prediction of recovery factor in Deepwater Gulf of Mexico (dGOM) oilfields. Recovery factor is affected by many geological and engineering factors; as a result, there is no explicit approach to accurately calculate it. This is particularly true for deepwater development, as the parameters associated with recovery-factor estimation have significant uncertainties and are usually extremely costly to obtain. Typically, the recovery factor of a field is estimated using analogs, material balance, decline curves, or numerical simulation. These deterministic approaches require representative geological models. However, enough information is often not available to capture realistic flow behavior. In addition, the estimated recovery factor can differ substantially between methods. Reservoir engineers are faced with the challenging task of estimating recovery factor by optimizing a large number of parameters with limited, sometimes inaccurate, information. This dilemma calls for an alternative approach to handling the noisy data.
Data mining and classification identify hidden patterns in unstructured data and tend to be fairly robust in the presence of noisy data. Using a database of 395 Deepwater Gulf of Mexico (dGOM) oilfields with 84 attributes, a set of dimensionless numbers is calculated for 59 oilfields with water drive. This helps in dimensionality reduction and scaling of reservoir models for comparison. Based on the distribution of dimensionless numbers, data-mining techniques such as K-means clustering followed by principal component analysis (PCA) are used to classify the oilfields into four categories. Subsequently, partial least squares (PLS) regression is used to relate the dimensionless numbers to recovery factor from the sparse dGOM data, which gives good coefficients of correlation for some clusters.
This paper shows that dimensionless numbers together with data-mining techniques provide a new, easy-to-implement method for predicting recovery factor for large datasets where the application of other methods is limited by high computational cost and time.
Verma, Sumit (The University of Wyoming) | Zhao, Tao (The University of Oklahoma) | Marfurt, Kurt J. (The University of Oklahoma) | Devegowda, Deepak (The University of Oklahoma) | Grana, Dario (The University of Wyoming)
The Barnett Shale in the Fort Worth Basin is one of the most important resource plays in the USA. TOC and brittleness can help characterize a resource play and assist in the search for sweet spots. Higher TOC, or organic content, is generally associated with hydrocarbon storage and with rocks that are ductile in nature. Brittle rocks, however, are more amenable to fracturing, with fracture faces that are more resistant to proppant embedment. Productive intervals within a resource play should therefore contain a judicious mix of organics and mineralogy that lends itself to hydraulic fracturing. Identification of these intervals through core acquisition and lab-based petrophysical measurements can be accurate but expensive in comparison to wireline logging.

In this work, we estimate TOC from wireline logs using Passey's method and attain a correlation of 60%. However, errors in the baseline interpretation can lead to inaccurate estimates of TOC. Using nonlinear regression with Passey's TOC, normalized stratigraphic height, and acquired wireline logs, the correlation increased to 80%. This regression can be applied to uncored wells with logs to estimate TOC and thereby provide ground truth within the seismic survey.

Core measurements provide relatively more accurate measures of both TOC and mineralogy. Brittleness index (BI) is computed from mineralogy using Wang and Gale's formula. The correlation between this mineralogy-based BI and the BI estimated using elastic logs (λρ, μρ, VP/VS, ZP, and ZS) and wireline logs is 78%. However, this correlation decreases to 66% if the BI is estimated using only wireline logs; the latter therefore serves as a less reliable proxy. We correlate production to volumetric estimates of TOC and brittleness by computing distance-weighted averages about assumed perforations in 120 horizontal wells.
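For reference, the two formulas named above can be sketched as follows; baseline values, level of maturity, and mineral fractions are inputs the interpreter must supply, and the example arguments are illustrative only:

```python
import numpy as np

def passey_toc(rt, dt, rt_base, dt_base, lom):
    """Passey delta-log-R TOC estimate (wt%) from deep resistivity rt
    (ohm-m) and sonic transit time dt (us/ft) against their non-source
    baseline values, scaled by the level of organic maturity (LOM)."""
    dlogr = np.log10(rt / rt_base) + 0.02 * (dt - dt_base)
    return dlogr * 10.0 ** (2.297 - 0.1688 * lom)

def wang_gale_bi(qtz, dol, lime, clay, toc):
    """Wang and Gale mineralogy-based brittleness index: brittle
    components (quartz + dolomite) over total (fractions by weight)."""
    return (qtz + dol) / (qtz + dol + lime + clay + toc)
```

At the baselines the delta-log-R separation is zero, so the TOC estimate is zero by construction; any miscalibrated baseline shifts every TOC estimate in the interval, which is the error source the regression step compensates for.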
We obtained a production correlation of 38% on blind wells, which encouragingly suggests that the geologic component of completions provides an important contribution to well success.