We have seen that two data sets can have the same univariate statistics yet very different spatial properties (Figure 1). The complex attributes we deal with in the petroleum industry can be described by random functions that combine regionalized and random variables. Regionalized variable theory is based on the statistics of the regionalized variable (RV), which differs from an ordinary scalar random variable in its spatial continuity yet still possesses the usual distribution statistics, such as mean and variance. The RV also differs in that it has a defined location.
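A minimal numerical sketch (synthetic data, illustrative only) makes the point concrete: the same set of values, arranged in two different spatial orders, has identical univariate statistics but very different experimental semivariograms.

```python
import numpy as np

def semivariogram(z, max_lag):
    """Experimental semivariogram of a 1-D series:
    gamma(h) = 0.5 * mean((z[i+h] - z[i])**2)."""
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

rng = np.random.default_rng(0)
values = rng.normal(size=200)

scattered = values.copy()     # values placed in random spatial order
ordered = np.sort(values)     # same values, spatially continuous order

# Identical univariate statistics...
assert np.isclose(scattered.mean(), ordered.mean())
assert np.isclose(scattered.var(), ordered.var())

# ...but very different spatial continuity: the ordered arrangement has a
# near-zero semivariogram at short lags, the scattered one does not.
print("scattered:", semivariogram(scattered, 3))
print("ordered:  ", semivariogram(ordered, 3))
```

The semivariogram, not the histogram, is what distinguishes the two arrangements, which is exactly why the RV's spatial continuity must be modeled explicitly.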
Frac-fluid delivery is selective in its effect, and so must fracture models be. Here, a physics-based analytical model, called the nine-grain model, is presented for production forecasting in multifrac horizontal wells in unconventional reservoirs; its formulation inherently allows for three-dimensional non-uniform SRVs, selective frac-hits, and pressure- and time-dependent permeabilities. The model is validated by constructing case studies of liquid and gas reservoirs and comparing the results with numerical simulations. In cases where both production history and frac-induced microseismic data are available, the SRV's spatial structure is extracted using a hybrid four-level straight-line technique that links volumetric RTA estimates to morphometric microseismic analysis and entails plots of plasticity, diffusivity, flowing material balance, and early linear flow. By applying our model to an oil well in the Permian Basin, we demonstrate that the knowledge gained from the coupled microseismic-RTA analysis helps resolve the non-uniqueness of RTA solutions. The proposed reservoir modeling procedure enables efficient incorporation of microseismic interpretations in modern RTA while honoring the space-time variability of the SRV, thus facilitating informed decision making in well and perforation-cluster spacing design.
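One of the plot families named above, the early-linear-flow diagnostic, can be sketched as follows (synthetic numbers, not the paper's formulation): during linear flow, the rate-normalized pressure drop grows linearly in the square root of time, and the fitted slope is what an analyst would relate to fracture surface area and the square root of permeability.

```python
import numpy as np

# Synthetic early-linear-flow data: during linear flow the rate-normalized
# pressure drop (p_i - p_wf)/q grows linearly in sqrt(t). The slope and
# intercept below are assumed values for illustration only.
t = np.linspace(1.0, 100.0, 50)                 # producing time, days
m_true, b_true = 12.0, 3.0                      # assumed slope/intercept
rnp = m_true * np.sqrt(t) + b_true              # rate-normalized pressure

# Straight-line analysis: regress the diagnostic variable on sqrt(t).
m_fit, b_fit = np.polyfit(np.sqrt(t), rnp, 1)
print(f"linear-flow slope = {m_fit:.2f}, intercept = {b_fit:.2f}")
```

In practice several such straight-line plots are interpreted jointly, which is one reason RTA solutions are non-unique without independent constraints such as microseismic geometry.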
Frac-hits. A frac-hit can be defined as a perturbation in a well's production rate and/or pressure induced by a child offset (or infill) well, usually triggered by pressure sinks created around parent wells or by high-permeability lithofacies. A frac-hit that temporarily alters the parent well's productivity is called a communication frac-hit, and those with long-term effects, generally caused by fracture interference, are referred to as interference frac-hits. A frac-hit may also compromise the productivity of the child well itself, since the existing pressure sinks distribute the fracing energy over a larger area and might lead to asymmetric fracture growth around the child well. Besides the parent well's operating conditions, microseismic monitoring of fracing can potentially indicate interference frac-hits, as it reveals fracture overlaps and any preferential fracture dilation toward existing wells. Depending on the rock and fluid properties, well age, parent-child horizontal and vertical distances, and the spatial extent of the stimulated reservoir volume (SRV), the constructive (Esquivel and Blasingame 2017) or destructive (King et al. 2017; Ajani and Kelkar 2012) effects of frac-hits can be experienced by the fractures, the SRV, or the entire drainage volume (stimulated and non-stimulated zones), usually by altering multiphase fluid interfacial arrangements in the rock and/or changing the dimensions of conductive fractures. Aside from prevention, thoroughly reviewed by Whitfield et al. (2018), it is essential to incorporate frac-hits into production forecasting models, which to date is not as straightforward as their detection. Both types of frac-hits cause a change in well productivity over time that is not necessarily correlated with pressure, which complicates the reservoir modeling process.
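Since a frac-hit is defined above as an observable perturbation in rate or pressure, a hedged toy sketch of its detection (synthetic data and thresholds, not any operator's workflow) is simply flagging an abrupt step in a parent well's daily rate:

```python
import numpy as np

def step_score(series, window):
    """Trailing-minus-leading window mean at each index; large negative
    values flag an abrupt rate loss."""
    s = np.asarray(series, dtype=float)
    score = np.zeros_like(s)
    for i in range(window, len(s) - window):
        score[i] = s[i:i + window].mean() - s[i - window:i].mean()
    return score

rng = np.random.default_rng(1)
days = np.arange(300)
rate = 100.0 * np.exp(-0.004 * days) + rng.normal(0.0, 1.0, days.size)
rate[200:] -= 15.0          # synthetic frac-hit: sudden productivity loss

score = step_score(rate, window=20)
hit_day = int(np.argmin(score))
print(f"candidate frac-hit near day {hit_day}")
```

The harder problem noted in the text remains: deciding whether such a step is a transient communication event or a lasting interference event requires forecasting models, not just detection.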
Martini, Brigette (Corescan Inc.) | Bellian, Jerome (Whiting Petroleum Corporation) | Katz, David (Encana Corporation) | Fonteneau, Lionel (Corescan Pty Ltd) | Carey, Ronell (Corescan Pty Ltd) | Guisinger, Mary (Whiting Petroleum Corporation) | Nordeng, Stephan H. (University of North Dakota)
Hyperspectral core imaging studies of the Bakken-Three Forks formations over the past four years have provided non-destructive, high-resolution, spatially referenced insight into mineralogy, both primary and diagenetically altered, that can be applied to reservoir characterization. While ‘big’ data such as co-acquired hyperspectral imagery, digital photography, and laser profiles can be challenging to analyze, synthesize, scale, visualize, and store, their value is unique: they provide mineralogical information, structural variables, and visual context at scales that lie between (and ultimately link) nano- and reservoir-scale measurements of the Bakken-Three Forks system.
Simultaneous, co-acquired hyperspectral core imaging data (at 500 μm spatial resolution), digital color photography (at 50 μm spatial resolution), and laser profiles (at 20 μm spatial and 7 μm vertical resolution) were acquired over 24 wells for a total of 2,870 ft of core, seven wells of which targeted the Bakken-Three Forks formations. These Bakken-Three Forks data (~5.5 TB) represent roughly 175,000,000 pixels of spatially referenced mineralogical data. Measurements were performed at a mobile Corescan HCI-3 laboratory based in Denver, CO, while spectral and spatial analysis of the data was completed offsite in Perth, WA, Australia, using proprietary in-house spectral software. Synthesis of the spectral-based mineral maps and laser-based structural data with ancillary data (including Qemscan, XRD, and various downhole geophysical surveys) was completed in several software and modelling platforms.
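A quick back-of-envelope check (the ~50 mm swath width is an assumption, not stated in the text) shows the quoted footage and pixel size are consistent with the ~175-million-pixel figure:

```python
# Hedged arithmetic check: 2,870 ft of core at 500 μm pixel size, imaged
# across an assumed ~50 mm-wide swath, gives roughly 175 million pixels.
FT_TO_MM = 304.8
pixel_mm = 0.5                        # 500 μm hyperspectral pixel
core_length_mm = 2870 * FT_TO_MM      # ~874,776 mm of core
swath_mm = 50.0                       # assumed imaged core width

lines = core_length_mm / pixel_mm             # pixels along the core
pixels_per_line = swath_mm / pixel_mm         # pixels across the core
total = lines * pixels_per_line
print(f"~{total:,.0f} pixels")
```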
The resulting spatial context of this hyperspectral imaging-based mineralogy and its assemblages is particularly compelling, both in small-scale micro-distributions and in borehole-scale mineralogical distributions related to both primary lithology and secondary alteration. These studies also present some of the first successful measurements and derivations of lithology from hyperspectral data. Relationships between hyperspectral-derived mineralogy and oil concentrations are presented, as are separately derived structural variables. The relationships between hyperspectral-based mineralogy and micro-scale reservoir characteristics (including those derived from Qemscan) were studied, as were relationships to larger-scale downhole geophysical data (yielding compelling correlations between resistivity and hyperspectral mineralogy). Finally, basic net-to-gross calculations were completed using the hyperspectral imaging data, thereby extending the use of such data from geological characterization through to resource estimation.
The high-fidelity mineralogical maps afforded by hyperspectral core imaging have not only provided new geological insight into the Bakken-Three Forks formations but also support improved well-completion designs in those formations, as well as a framework for applying the technology to other important unconventional reservoir formations in exploration and development. The semi-automated nature of the technology also makes it possible to consistently and accurately log mineralogy from multiple wells and fields globally, allowing for advanced comparative analysis.
Proper characterization of heterogeneous rock properties and hydraulic fracture parameters is essential for optimizing well spacing and reliable estimation of EUR in unconventional reservoirs. High resolution characterization of matrix properties and complex fracture parameters requires efficient history matching of well production and pressure response. We propose a novel reservoir model parameterization method to reduce the number of unknowns, regularize the ill-posed problem and enhance the efficiency of history matching of unconventional reservoirs.
Our proposed method makes a low-rank approximation of the spatial distribution of reservoir properties, taking into account the differing model resolutions of the matrix and the hydraulic fractures. Typically, hydraulic fractures are represented at much higher resolution through local grid refinements than the matrix properties are. In our approach, the spatial property distributions of both the matrix and the fractures are represented using a few parameters via a linear transformation with multiresolution basis functions. The parameters in the transform domain are then updated during model calibration, substantially reducing the number of unknowns. The multiresolution basis functions are constructed by eigen-decomposition of the Laplacian of a grid that is adaptively coarsened to match the data resolution. Maintaining high property resolution in the area of interest through adaptive resolution control, while keeping the original grid structure, improves the quality of history matching and reduces simulation runtime.
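The core parameterization idea can be sketched in a few lines (a 1-D illustrative toy, not the authors' implementation): build a grid Laplacian, take its smoothest eigenvectors as basis functions, and represent the property field with a handful of transform-domain coefficients, m ≈ Φc.

```python
import numpy as np

n = 64                                                 # grid cells
# 1-D grid Laplacian with Neumann-style boundary rows.
L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
L[0, 0] = L[-1, -1] = 1.0

eigvals, eigvecs = np.linalg.eigh(L)                   # ascending eigenvalues
k = 8                                                  # number of basis functions
Phi = eigvecs[:, :k]                                   # smoothest k eigenvectors

x = np.linspace(0, 1, n)
m_true = np.cos(np.pi * x) + 0.5 * np.cos(3 * np.pi * x)   # synthetic property field

c = Phi.T @ m_true                                     # k unknowns instead of n
m_approx = Phi @ c                                     # low-rank reconstruction
err = np.linalg.norm(m_true - m_approx) / np.linalg.norm(m_true)
print(f"relative reconstruction error with {k}/{n} parameters: {err:.4f}")
```

During history matching, only the k coefficients in c would be calibrated, which is the source of the reduction in unknowns described above; the multiresolution and adaptive-coarsening aspects extend this same construction to 3-D grids with local refinements.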
We demonstrate the power and efficacy of our method using synthetic and field examples. First, we illustrate the effectiveness of the proposed multiresolution parameterization by comparing it with a traditional method. For the field application, an unconventional tight-oil reservoir model with a multistage hydraulically fractured well is calibrated using bottomhole pressure and water-cut history data. The hydraulic fractures, as well as the stimulated reservoir volume (SRV) near the well, are represented with higher grid resolution. In addition to matrix and fracture properties, the extents of the SRV and the hydraulic fractures are also adjusted through history matching using a Multiobjective Genetic Algorithm. The calibrated ensemble of models is used to obtain bounds on the production forecast.
Our proposed method is designed to calibrate reservoir and fracture properties with higher resolution in regions that have improved data resolution and higher sensitivity to the well performance data, for example the SRV region and the hydraulic fractures. This leads to a fast and efficient history matching workflow and enables us to make optimal development/completion plans in a reasonable time frame.
Multiple-point statistical (MPS) simulation is a modern pattern-based geostatistical approach for describing and stochastically simulating geologic formations with complex connectivity patterns. In MPS geostatistical simulation, a template containing data patterns around each simulation cell is used to extract and store local conditional probabilities from a training image (TI). To generate a simulated sample, a random path is generated to sequentially visit all unsampled grid cells and draw conditional samples from the corresponding stored conditional probabilities. The grid-based implementation of MPS simulation offers several advantages for the integration of hard and soft data. In the Single Normal Equation SIMulation (SNESIM) implementation of MPS for facies simulation, it has been observed that integrating soft data can result in many facies realizations whose patterns are inconsistent with the incorporated probability map. This is partly explained by the Markov property, which considers only probabilities co-located with the simulation node and hence ignores spatial information from neighboring cells. In addition to this effect, we show that another important mechanism is at play in the SNESIM algorithm that explains the observed behavior. Specifically, at the early stage of the simulation, when the first few percent of the simulation nodes on the random path are visited, the local conditioning data are limited and the conditional probabilities obtained from the TI are not strictly constrained; hence they cover a wide range of values in [0,1]. However, after this initial stage, as the simulated data populate more cells in the model grid, they tend to severely constrain the conditional probabilities to assume extreme values of 0 or 1.
With these extreme values at the later stages of the simulation, the probability values included in the soft data (as a secondary source of information) tend to be disregarded, and the facies types are predominantly determined by the TI. We demonstrate and discuss this behavior of the SNESIM algorithm through several examples and present strategies that can be adopted to compensate for this effect. The presented examples relate to indirect integration of flow data: probabilistic information about facies types is first inferred from the flow data, and the results are then used as soft data for integration into the SNESIM algorithm.
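The narrowing mechanism described above can be reproduced with a toy experiment (a small synthetic TI and template, not the SNESIM code): conditional facies probabilities scanned from a TI spread over (0, 1) when few neighbors are conditioned, but collapse toward 0 or 1 as more neighbors become informed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary training image with spatial correlation: thresholded,
# box-smoothed Gaussian noise.
noise = rng.normal(size=(40, 40))
smooth = sum(np.roll(np.roll(noise, di, 0), dj, 1)
             for di in (-1, 0, 1) for dj in (-1, 0, 1))
ti = (smooth > 0).astype(int)

def conditional_probs(ti, n_cond, n_trials=200):
    """Sample P(facies=1 | n_cond conditioned neighbors) by scanning the TI
    for replicates of randomly drawn neighbor patterns."""
    offsets = [(0, 1), (0, -1), (1, 0), (-1, 0)]
    core = ti[2:-2, 2:-2]
    shifted = [np.roll(np.roll(ti, -di, 0), -dj, 1)[2:-2, 2:-2]
               for di, dj in offsets]
    probs = []
    for _ in range(n_trials):
        i, j = rng.integers(2, ti.shape[0] - 2, size=2)
        pattern = [ti[i + di, j + dj] for di, dj in offsets[:n_cond]]
        mask = np.ones(core.shape, dtype=bool)
        for s, v in zip(shifted[:n_cond], pattern):
            mask &= (s == v)
        if mask.any():
            probs.append(core[mask].mean())
    return np.array(probs)

few = conditional_probs(ti, n_cond=1)    # early stage: sparse conditioning
many = conditional_probs(ti, n_cond=4)   # later stage: dense conditioning
print("mean |p - 0.5| with 1 neighbor :", np.abs(few - 0.5).mean().round(3))
print("mean |p - 0.5| with 4 neighbors:", np.abs(many - 0.5).mean().round(3))
```

Once the TI-derived probabilities sit near 0 or 1, any soft-data probability combined with them has little influence, which is the behavior the examples in the paper examine.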
Han, Rubing (Research Institute of Petroleum Exploration and Development, Petrochina) | Li, Shunming (Research Institute of Petroleum Exploration and Development, Petrochina) | Song, Benbiao (Research Institute of Petroleum Exploration and Development, Petrochina) | Tian, Changbing (Research Institute of Petroleum Exploration and Development, Petrochina)
The giant H oil field was discovered in 1976 and put into development in 2008. The Mishrif reservoir holds half of the field's geological reserves, and its current development challenges are considerable. First, the sedimentary evolution is very complicated (inner ramp, lagoon, shallow sea, etc.); the existing facies model is generally derived from rock-type interpolation with a kriging algorithm, does not reflect sedimentary understanding, and differs considerably from the actual situation displayed by the seismic data. Furthermore, the variogram of rock types is also difficult to estimate accurately, resulting in a poor match in reservoir simulation. Second, the seismic data quality is generally low, and its correlation with rock types is even worse; if used directly to constrain rock-type interpolation, the accuracy is generally unsatisfactory.
In order to solve these problems, this study first comprehensively examined core, thin-section, scanning electron microscopy, X-ray diffraction, electron probe, and other data for sequence-stratigraphic and sedimentary research. Secondly, based on the single-well facies interpretation combined with the inversion data, the dimensions, spatial relationships, and variograms of each depositional element were studied. Thirdly, the facies model was built using the Sequential Indicator Simulation (SIS) algorithm and was manually modified zone by zone according to the sequence-stratigraphic and sedimentary understanding. Finally, the relationship between depositional elements and rock types was studied, and probability volumes of rock types for the different depositional elements were obtained. On this basis, the SIS algorithm was used to simulate rock types, which could then serve as a constraint for building petrophysical models.
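The final step above can be sketched in miniature (all facies names and probability values below are hypothetical, and a full SIS would add indicator kriging from neighboring cells; here cells are drawn independently for brevity): given a facies model and per-depositional-element rock-type probability tables, draw a rock type for each cell.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical facies (depositional-element) model: 0, 1, 2 label elements.
facies = rng.integers(0, 3, size=(20, 20))

# P(rock type | depositional element); rows sum to 1 (hypothetical values).
p_rt_given_facies = np.array([
    [0.7, 0.2, 0.1],     # element 0: mostly mud-dominated rock types
    [0.1, 0.6, 0.3],     # element 1
    [0.2, 0.3, 0.5],     # element 2
])

# Draw a rock type per cell, conditioned on the local depositional element.
rock_type = np.empty_like(facies)
for f in range(3):
    cells = facies == f
    rock_type[cells] = rng.choice(3, size=cells.sum(), p=p_rt_given_facies[f])
print(np.bincount(rock_type.ravel()))
```

Constraining the rock-type draw through the facies model in this way is what lets the workflow bypass the poor direct correlation between seismic data and rock types.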
The results showed that the correlation between depositional elements and rock type was clearly better than that with the inversion wave impedance. The manually modified facies model better reflected the sedimentary picture and honored the various static and dynamic data. The rock-type model obtained under the constraint of the facies model also sidestepped the problem that the seismic data quality was not good enough for direct rock-type interpolation.
It was verified that, with the new modeling workflow, the preliminary numerical simulation match rate reached over 80%. The facies model was then verified against a water-flooding test in the H oil field, with good results. This workflow could provide a good reference for similar oilfields in the Middle East.
C. H. Stone, W. W. Fleckenstein, and A. W. Eustes, Colorado School of Mines

Summary

The United States National Science Foundation has funded a sustainability-research network focused on natural-gas development in the Rocky Mountain region of the United States. The objective of this specific study is the assessment of the use of existing water wells to monitor the risk of contamination by the migration of fracturing fluids or hydrocarbons to freshwater aquifers. An additional objective of the study is to modify existing risk estimates using the spatial relationships between the existing water wells and producing oil wells. This will allow estimates of single-barrier failure and multiple-barrier failure, resulting in contamination projections for oil and gas wells in areas without surrounding water wells to detect migration, dependent on well-construction type. Since 1970, the Wattenberg Field in Colorado has had a large number of oil and gas wells drilled. These wells are tightly interspersed with agricultural and urban development from the nearby Denver metropolitan area. This provides a setting with numerous water wells drilled within an area of active petroleum development. Data from 17,948 wells drilled in Wattenberg Field were collected and analyzed, allowing wells to be classified by construction type and analyzed for barrier failure and source of aquifer contamination. The assessment confirms that although natural-gas migration through poorly constructed wellbores is infrequent, it can happen, and the migration risk is determined by the well-construction standards. The assessment also confirms that there has been no occurrence of hydraulic-fracturing-fluid contamination of freshwater aquifers through wellbores. Finally, the assessment determines the spatial proximity of oil and gas wells, and their surface-casing depths, relative to water wells, in order to establish the utility of water wells for monitoring migration from oil wells.
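The proximity step can be sketched as follows (synthetic coordinates and an illustrative cutoff, not the study's data): for each water well, find the distance to the nearest oil/gas well so that wells can be screened for use as migration monitors.

```python
import numpy as np

rng = np.random.default_rng(0)
oil_wells = rng.uniform(0, 10_000, size=(500, 2))    # easting/northing, m
water_wells = rng.uniform(0, 10_000, size=(80, 2))

# Distance from every water well to its nearest oil/gas well.
diff = water_wells[:, None, :] - oil_wells[None, :, :]
nearest_m = np.sqrt((diff ** 2).sum(axis=2)).min(axis=1)

monitor_radius_m = 800.0                             # illustrative cutoff
candidates = np.flatnonzero(nearest_m <= monitor_radius_m)
print(f"{candidates.size} of {len(water_wells)} water wells lie within "
      f"{monitor_radius_m:.0f} m of an oil/gas well")
```

Areas where no water well falls within the chosen radius are exactly the areas for which the study's construction-type-dependent contamination projections are needed.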
Introduction

The Wattenberg Field in the Denver-Julesburg Basin, Colorado, began oil and gas production in 1970. The field is the most active oil and gas field in Colorado and borders the highest-population area of the state, the Denver metropolitan area (Figure 1).
The field of data-driven analytics and machine learning is rapidly evolving today and slowly beginning to reshape the petroleum sector with transformative initiatives.
This work describes a heuristic approach combining mathematical modeling and associated data-driven workflows for estimating reservoir pressure surfaces through space and time using measured data. This procedure has been implemented successfully in a giant offshore field with a complex history of active pressure management by water and gas.
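A minimal sketch of the spatial interpolation step (inverse-distance weighting on synthetic data; the paper's actual algorithm is not specified here) shows how scattered pressure measurements become a map; a time dimension could be added as a third, scaled coordinate.

```python
import numpy as np

def idw(xy_obs, p_obs, xy_grid, power=2.0, eps=1e-9):
    """Inverse-distance-weighted estimate of pressure at grid points."""
    d = np.linalg.norm(xy_grid[:, None, :] - xy_obs[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power
    return (w * p_obs).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(0)
xy_obs = rng.uniform(0, 1, size=(25, 2))            # well locations
p_obs = 3000.0 - 500.0 * xy_obs[:, 0]               # synthetic pressure trend, psi

gx, gy = np.meshgrid(np.linspace(0, 1, 20), np.linspace(0, 1, 20))
xy_grid = np.column_stack([gx.ravel(), gy.ravel()])

p_map = idw(xy_obs, p_obs, xy_grid)
print(f"pressure map range: {p_map.min():.0f}-{p_map.max():.0f} psi")
```

Because IDW estimates are convex combinations of the observations, the resulting map never extrapolates outside the measured pressure range, a useful property when the map is used as a history-matching benchmark.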
This practical workflow generates present-day pressure maps that can be used directly in reservoir management by locating poorly supported areas and planning mitigation activities. It assists and guides the history matching process by offering a benchmark against which simulated pressures can be compared. Combined with data-based streamline computation, this workflow improves the understanding of fluid movements, helps identify baffles, and assists in field sectorization.
The distinctive feature of this data-driven approach is its unbiased reliance on field-observed data, which complements the complex, compute-intensive schemes typically found in reservoir simulation. Conventional dynamic simulation and streamline tracing require adequate static (e.g., permeability tensor) and dynamic models (e.g., pressures for each active cell in the system).
By contrast, data-driven streamlines are readily available and already calibrated to observed data.
This paper contributes innovative algorithms and workflows to the relatively limited existing body of literature on data-driven methods for pressure mapping.
In this case study, new insights such as inter-reservoir communication are revealed, enabling a better understanding of gas movement and supporting a change in production strategy.
The paper is organized as follows. After a general overview of the field studied, it describes in detail the workflows used to interpolate pressures in space and time, along with cross-validation results. Various applications of the pressure predictions are presented in the sections thereafter.