Artificial neural networks are becoming increasingly popular in the oil and gas industry. In the past, studies have examined the use of artificial neural networks in reservoir characterization, field development and formation damage prediction, to name a few. The aim of this study is to provide guidelines for successfully developing and training an artificial neural network (ANN) that predicts reservoir properties which, when input into a reservoir simulation model, yield an improved history match. An ANN was developed to improve the history match with a small number of simulation runs for a reservoir that produced oil, gas and water over a ten-year period. Because no specific protocols exist for this type of study, a trial-and-error process was used to establish guidelines and suggestions.
The neural network was developed by using an inverse solution method to formulate the training and testing data. Normalization of the data simplified the neural network, improved its effectiveness and enhanced its performance. The feed-forward network with back-propagation and the hyperbolic tangent sigmoid function (tansig) in the hidden layers of the network proved to be most effective in the training/learning process.
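As a minimal sketch of the kind of normalization described above (the exact scheme used by the authors is not specified; mapping to the [-1, 1] output range of tansig is an assumption), the inputs can be rescaled and the predictions mapped back to physical units:

```python
import numpy as np

def normalize(x, lo=-1.0, hi=1.0):
    """Linearly rescale each input column to [lo, hi] (the tansig output range)."""
    xmin, xmax = x.min(axis=0), x.max(axis=0)
    scaled = (x - xmin) / (xmax - xmin)          # map each column to [0, 1]
    return lo + scaled * (hi - lo), (xmin, xmax)

def denormalize(y, bounds, lo=-1.0, hi=1.0):
    """Invert the scaling to recover predicted properties in physical units."""
    xmin, xmax = bounds
    return xmin + (y - lo) / (hi - lo) * (xmax - xmin)
```

Keeping the inverse transform alongside the forward one matters here, since the network's outputs must be returned to physical reservoir-property units before being fed to the simulator.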
Results indicated that functional links and eigenvalues of various system-related matrices were effective in the training/learning process. These provided the network with the necessary connections linking the inputs to the required outputs. It was necessary to input production differences between the historical and simulated performances at specific times to successfully train the network and predict realistic property values for the reservoir. Data structure and production time intervals influenced the training time as well as the accuracy of the predictions. If time intervals were too short, training times were longer, memorization occurred, and the network could not accurately predict the reservoir's properties. Most of the effective functional links included relationships between permeability and other factors such as porosity, the areas of the regions in the reservoir and the distances from the producer to the reservoir boundaries.
The M4.1 reservoir in the Tahoe Field located in the Gulf of Mexico was used as a case study to illustrate the use of ANNs in decreasing the amount of numerical reservoir simulations required to obtain an improved history match. The effective parameters, obtained from network development, were applied to data from the M4.1 reservoir simulations to determine which functional links and architecture would be most effective in training the network. It was observed that some of the functional links and network structures that were effective in network development were also effective in the ANN developed for the M4.1 reservoir while some were not.
Artificial neural networks are information processing systems that are a rough approximation and simplified simulation of the biological neural network system. The first practical application of ANNs came in the late 1950s when Frank Rosenblatt and his colleagues demonstrated their ability to perform pattern recognition1. However, interest in neural networks dwindled due to their limitations as well as the lack of new ideas and powerful computers1. With some of these hurdles overcome in the 1980s, and with the development of the back-propagation algorithm for training multilayer perceptron networks, there was renewed interest in the field. Since then, ANNs have been improved and applied in the aerospace, automotive, defense, transportation, telecommunications, electronics, entertainment, manufacturing, financial, medical and oil and gas industries, to name a few.
In recent years, there has been growing interest in applying ANNs to petroleum engineering. ANNs used in this field are based on supervised training algorithms that have the potential to solve many of the industry's challenging and complex problems2. Previous studies on the applications of neural networks include reservoir characterization, field development, two-phase flow in pipes, identification of well test interpretation models, completion analysis, formation damage prediction, permeability prediction and fractured reservoirs.
This work is the result of the authors' experience with sandy reservoirs with clay-mineral matrices in the San Jorge Basin. It is intended to highlight the advantages of integrated reservoir models derived from the common lithologic characteristics shared among reservoirs, their integration with sedimentary sub-environments inferred from logs, and high-technology log data (spectral gamma ray and magnetic resonance).
The majority of the reservoirs of the Bajo Barreal formation are volcanoclastic sandstones whose porosities are modified and/or reduced with burial and increasing diagenetic grade. That decrease takes place mainly through three processes: mechanical compaction, dissolution of grains at intergranular contacts, and pore cementation.
The quantification of compaction is influenced by the abundance and type of lithic material. The use of technologies that provide a detailed estimate of the lithology (spectral gamma ray logs and percussion or rotary sidewall cores) is therefore of pre-eminent importance when evaluating this type of reservoir.
The diagenetic conditions and processes directly affect porosity determination from conventional logs. This justifies the use of porosity tools that are independent of the reservoir rock's matrix material. In particular, combining porosity data obtained from NMR with lithologic determinations and appropriate logs provides an attractive alternative for improving the evaluation of sandy reservoirs.
Generally, volcanoclastic sandstones are considered to have poor potential as oil reservoirs because of the low porosity and permeability that result from compaction and the precipitation of authigenic minerals, such as cement. However, thanks to the high reactivity of their constituents with reservoir fluids, important secondary porosity can develop, turning them into high-quality reservoirs. For this reason, these processes can be used as indicators of the quality of volcanoclastic sandstones.
The implementation of methodologies such as the one presented in this work seeks to improve the criteria used to evaluate sandy reservoirs.
A rapid and effective reservoir simulation model was built from limited information. The basic data available for this field were 3D seismic impedance, log interpretations from two exploratory wells, core data, and a well test consisting of an isochronal test and, several months later, an extended flow period followed by a long build-up. The main objective of this study is to estimate recoverable reserves, with and without hydraulic fracturing, in the Mulichinco formation at the Paso Del Indio Field.
Material balance and decline curve analyses have important limitations for estimating ultimate recoverable reserves in tight gas reservoirs. Type wells derived from conceptual simulation models do not reflect the effective drainage area or the permeability heterogeneity in the field. A representative permeability over the well spacing area should be averaged harmonically or geometrically. In order to estimate the ultimate recoverable reserves in tight gas reservoirs, the permeability heterogeneity or the effective drainage area available to hydraulic fractures should be simulated effectively. Relatively small changes in permeability can result in an unsuccessful fracture design and in uneconomical flow rates.
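The harmonic and geometric averages mentioned above can be sketched as follows (a minimal illustration with hypothetical layer permeabilities; function names are ours):

```python
import numpy as np

def harmonic_mean(k):
    """Harmonic average, appropriate when flow crosses the layers in series."""
    k = np.asarray(k, dtype=float)
    return len(k) / np.sum(1.0 / k)

def geometric_mean(k):
    """Geometric average, often used for spatially random permeability fields."""
    k = np.asarray(k, dtype=float)
    return float(np.exp(np.mean(np.log(k))))

# Hypothetical layer permeabilities in mD: the harmonic mean is dominated by
# the tightest layer, which is why small permeability changes matter so much.
layers = [0.05, 0.5, 5.0]
```

For the hypothetical layers above the harmonic mean sits far below the geometric one, illustrating how a thin tight interval controls series flow and, in turn, the economics of a fracture design.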
Rather than relying on conventional volumetric, material balance, and decline curve analysis for tight gas reservoirs, a very quick and effective simple reservoir simulation model was established to estimate recoverable reserves. In the absence of production history, well test information was used very successfully as history-matching data to validate the geological, petrophysical, and PVT models.
An Artificial Neural Network (ANN) was designed and tested in the present study to examine the correlation between permeability estimates and porous medium properties such as porosity, specific surface area, and irreducible water saturation. The network developed in this work is a predictive tool that uses soft computing techniques to estimate the absolute permeability of carbonate reservoirs. The Artificial Neural Network toolbox of MATLAB® R2006b and the feed-forward error back-propagation methodology were used in the construction of the network. Carbonate reservoir field data presented in the literature were utilized in the training, testing, and validation of the proposed model. The present study indicates that ANN-generated permeability values are consistent with those obtained from core analysis. Results from this study confirm the complex relationship among permeability, porosity, specific surface area and irreducible water saturation of carbonate reservoirs, and suggest that variations in specific surface area affect the magnitude of irreducible water saturation, thus creating an apparent dependence of permeability on irreducible water saturation.
Additional observations support a direct relationship between porosity and permeability, and an inverse relationship between specific surface area and permeability.
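The study used MATLAB's toolbox; as a hedged, self-contained sketch of the same idea (a feed-forward network with one tanh hidden layer trained by back-propagation on the three inputs named above), one could write the following. The synthetic samples and target relation are hypothetical and serve only to make the example runnable:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical samples: porosity (frac), specific surface area, irreducible Sw
X = rng.uniform([0.05, 0.5, 0.1], [0.30, 5.0, 0.5], size=(200, 3))
# Synthetic log-permeability target for illustration only (not field data)
y = np.log10(X[:, 0] ** 3 / (X[:, 1] ** 2 * X[:, 2])).reshape(-1, 1)

# Standardize inputs and target (cf. the normalization step in ANN training)
Xn = (X - X.mean(0)) / X.std(0)
yn = (y - y.mean()) / y.std()

# One hidden layer of 8 tanh ("tansig") units, linear output unit
W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

lr = 0.05
for _ in range(3000):
    H = np.tanh(Xn @ W1 + b1)                 # forward pass
    pred = H @ W2 + b2
    err = pred - yn                           # error term of the MSE gradient
    gW2 = H.T @ err / len(Xn); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1.0 - H ** 2)        # back-propagate through tanh
    gW1 = Xn.T @ dH / len(Xn); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2            # gradient-descent update
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(Xn @ W1 + b1) @ W2 + b2 - yn) ** 2))
```

Predicting log-permeability rather than permeability itself is a common modeling choice given the quantity's wide dynamic range; it is an assumption here, not a detail taken from the paper.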
Porosity-permeability relationships are of great importance for the reservoir engineer because of the difficulties and uncertainties associated with direct permeability interpretations from well-log data. Accurate permeability predictions provide engineers with the ability to design and manage efficient processes in the development of oil and gas fields. Although it is generally accepted that permeability is closely related to porosity, their relationship cannot be captured by a simple expression. Absolute permeability is a dynamic flow property, while porosity is a measure of the storage capacity of a rock, a static rock property. The absolute permeability of a porous medium varies with grain size, sorting, cementing, direction, and location; hence the scatter commonly seen in permeability plots.
A wide range of permeability correlations using pore- and field-scale models are presented in the literature1-3. Starting with the seminal works by Kozeny4 and Carman5, many different correlations have been proposed between porosity and permeability. The Kozeny-Carman equation was developed for a porous medium represented by a bundle of uniform capillary tubes and introduces a direct dependence between porosity and permeability, while accounting for specific surface area and tortuosity as a measure of flow resistance.
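A common form of the Kozeny-Carman relation can be written down directly; the shape constant and tortuosity values below are typical assumed defaults, not values taken from the cited works:

```python
def kozeny_carman(phi, s_gv, tau=2.0, c=2.0):
    """Kozeny-Carman permeability for a bundle-of-capillary-tubes model.

    phi  : porosity (fraction)
    s_gv : specific surface area per unit grain volume (1/length)
    tau  : tortuosity (assumed typical value)
    c    : Kozeny shape constant (assumed typical value)
    Returns permeability in (length)^2 of the chosen unit system.
    """
    return phi ** 3 / (c * tau * s_gv ** 2 * (1.0 - phi) ** 2)
```

Note how permeability rises steeply with porosity and falls with the square of specific surface area, consistent with the direct and inverse relationships reported in the preceding study.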
For unconsolidated porous media with variable particle size, Panda and Lake6 proposed a modification of the Kozeny-Carman equation to express permeability in terms of particle-size distribution characteristics and bulk physical rock properties. They found reasonable agreement between predicted and experimental permeability, relying on appropriate estimations of surface area, and demonstrated the modest impact of sorting on the quality of their predictions. With respect to sorting, porosity tends to increase for perfectly sorted media and decrease as sorting becomes poorer7, thus affecting permeability.
Chemical flooding is a process that improves recovery efficiency through the injection of a chemical system containing surfactants and polymers. Surfactants are substances that lower the interfacial tension and promote a miscible displacement of residual oil. Polymers decrease the water mobility; in this process, the polymer is injected after the surfactant to promote an immiscible displacement of the micellar bank. Diverse operational and reservoir parameters can influence the micellar injection process. This work analyses the oil recovery efficiency of this process in reservoirs with characteristics of existing reservoirs in the northeast region of Brazil, identifying the optimal applicability parameters and the possible technical viability under these characteristics. The numerical study was performed with the commercial simulator STARS from CMG (Computer Modelling Group).
The reservoir parameters evaluated were porosity, permeability, oil viscosity and initial oil saturation, chosen according to the characteristics of some Brazilian northeast reservoirs, in order to assess the applicability of the process under these reservoir characteristics and to obtain the optimal application conditions. Then, an analysis of the operational parameters was carried out with the objective of optimizing the process. The results showed improved recovery efficiencies for the diverse reservoir characteristics encountered in the State of Rio Grande do Norte. For the cases and reservoir conditions studied, the most critical factor affecting the micellar process was the initial oil saturation, and the process optimization showed an increase in cumulative oil production and in the recovery factor. Increasing oil recovery at low cost is a constant goal of petroleum companies and academic institutions, especially given uncertain oil prices. It is therefore necessary to study processes that improve recovery efficiency, such as chemical flooding.
The injection of micellar/polymer solutions into oil reservoirs is one of the chemical methods currently in use. The method consists of displacing the residual oil trapped in the reservoir by injecting an aqueous solution of surfactant and polymer. The technology is based on the interfacial tension between two fluids: the surface-active agent lowers the tension between the fluids inside the reservoir, favoring the displacement of residual oil by the injected solution. The procedure consists of injecting the solution so as to form a fluid bank that displaces the oil at low interfacial tension (see Figure 1).
The surfactant is responsible for the decrease in interfacial tension, since it is a substance that adsorbs at the interface between the two liquids. The polymer's role in the solution is to decrease the water mobility, since small concentrations of this substance cause a large increase in the water viscosity. This mobility reduction suppresses fingering of the injected solution through the oleic phase ahead of it.
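The effect described above can be quantified with the end-point mobility ratio; the relative permeabilities and viscosities below are hypothetical round numbers:

```python
def mobility_ratio(krw, mu_w, kro, mu_o):
    """End-point mobility ratio M = (krw / mu_w) / (kro / mu_o).
    M <= 1 indicates a stable displacement; M > 1 promotes viscous fingering."""
    return (krw / mu_w) / (kro / mu_o)

# Hypothetical case: polymer raises the aqueous-phase viscosity from 1 to 10 cP
m_water = mobility_ratio(krw=0.3, mu_w=1.0, kro=0.8, mu_o=20.0)    # unfavorable
m_polymer = mobility_ratio(krw=0.3, mu_w=10.0, kro=0.8, mu_o=20.0) # stabilized
```

With these assumed values the polymer brings the mobility ratio from well above 1 down below 1, which is the mechanism by which it suppresses fingering.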
We present an autocorrelation estimator based on the dispersion variance (DVA) as an alternative to the classic semivariogram (CSV) used in Geostatistics to determine spatial distributions of reservoir rock properties. We evaluate DVA behavior with 2-D synthetic and field data sets, and we compare its performance to that of CSV. DVA proves reliable for estimating spatial autocorrelation and anisotropy in two dimensions; it is more efficient than the classic estimator, and it is very smooth and easy to fit with a theoretical model. We also use a set of porosity field measurements and a cross-validation technique to validate an autocorrelation model obtained with DVA. We conclude that DVA is a powerful tool for determining spatial autocorrelation.
An important area of Reservoir Engineering is Reservoir Characterization, which studies the characteristics and properties of reservoir rocks such as porosity and permeability. Geostatistics is a powerful tool in Reservoir Characterization because it provides a series of techniques to interpolate throughout the whole reservoir the values of rock properties measured only at well locations. One of the most critical aspects of Geostatistics is the determination of the spatial autocorrelation of properties. The classic semivariogram (CSV) is the most widely used autocorrelation estimator1-2, though it is neither precise at long distances nor robust to atypical data and non-Gaussian distributions.
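For reference, the classic (Matheron) semivariogram estimator that DVA is compared against can be sketched as follows; the binning by lag distance with a tolerance is an implementation choice of ours:

```python
import numpy as np

def classic_semivariogram(xy, z, lags, tol):
    """Classic estimator: gamma(h) = (1 / 2N(h)) * sum_(i,j) (z_i - z_j)^2,
    summing over the N(h) pairs whose separation lies within h +/- tol."""
    xy, z = np.asarray(xy, float), np.asarray(z, float)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    iu = np.triu_indices(len(z), k=1)               # count each pair once
    dist, dz2 = d[iu], ((z[:, None] - z[None, :]) ** 2)[iu]
    gamma = []
    for h in lags:
        in_bin = np.abs(dist - h) <= tol
        gamma.append(0.5 * dz2[in_bin].mean() if in_bin.any() else np.nan)
    return np.array(gamma)
```

The shrinking number of pairs N(h) at long lags is exactly why this estimator becomes noisy at long distances, the shortcoming DVA is designed to address.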
In this work we present an alternative autocorrelation estimator, DVA, to overcome these shortcomings. DVA is based on the dispersion variance defined within moving windows of different sizes. DVA can be applied to two and three-dimensional data sets that are not regularly distributed in space, which is of great practical interest.
We first compare DVA's behavior to that of CSV's for synthetic two-dimensional data sets. To create this data, we generate fields in two dimensions, representing average values of porosity measured at 50 well locations irregularly scattered in an oil area. These values are randomly sampled from a much denser synthetic grid previously generated with a known spatial autocorrelation. This dense grid represents the whole original reservoir porosity whose autocorrelation we want to determine. We compare the ability of DVA and CSV to recover the known autocorrelation from each randomly sampled data set.
We also analyze a 2-D field data set of average porosity measurements in a North Sea deposit. We determine the autocorrelation by means of DVA. In this case, using a cross-validation technique, we test the reliability of DVA as a method to characterize the spatial structure3-4. This technique is used to verify the ability of a particular semivariogram model to describe the autocorrelation of data. It compares the original measured values to those estimated with the kriging algorithm. This algorithm calculates a kriged value for each location where there is a measured property, using the measured values at other locations and a semivariogram model. By comparing the measured and kriged value at each location, the accuracy of the semivariogram can be evaluated.
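The cross-validation loop described above can be sketched with ordinary kriging; the spherical semivariogram model is an assumed choice, and the implementation below is a minimal illustration rather than the authors' code:

```python
import numpy as np

def spherical(h, sill, rng_a, nugget=0.0):
    """Spherical semivariogram model (an assumed model choice)."""
    h = np.asarray(h, float)
    g = np.where(h < rng_a,
                 nugget + (sill - nugget) * (1.5 * h / rng_a - 0.5 * (h / rng_a) ** 3),
                 sill)
    return np.where(h == 0.0, 0.0, g)

def krige(xy, z, xy0, model):
    """Ordinary kriging estimate at xy0 from data (xy, z)."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1)); A[:n, :n] = model(d); A[n, n] = 0.0
    b = np.ones(n + 1); b[:n] = model(np.linalg.norm(xy - xy0, axis=1))
    w = np.linalg.solve(A, b)                 # weights plus Lagrange multiplier
    return float(w[:n] @ z)

def loo_cross_validation(xy, z, model):
    """Re-estimate each datum from all the others; return the kriged values."""
    xy, z = np.asarray(xy, float), np.asarray(z, float)
    idx = np.arange(len(z))
    return np.array([krige(xy[idx != i], z[idx != i], xy[i], model) for i in idx])
```

Comparing the returned kriged values with the measured ones (e.g. through their residuals) is the accuracy check the paragraph above describes.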
The study of spatial autocorrelation is vital to establish a suitable model of spatial variability and anisotropy. Analyzing the synthetic data, we see that DVA is smooth and more stable than CSV at long distances, and is easier to fit with a theoretical model. It is able to detect the global statistics of the data, while CSV is more influenced by the particular characteristics of each measured set. In the field case, the predictions made with DVA turn out to be more accurate than those made with CSV. We conclude that DVA allows a more reliable autocorrelation model to be developed than CSV, detecting the global statistics of the data and making better predictions.
This paper describes the implementation of a petroelastic model (PEM) based on Gassmann's equation to calculate seismic attributes into a commercial reservoir flow simulator. This implementation is the first step of a project to integrate time-lapse (4D) seismic attributes into an assisted history matching tool developed in a previous project.
The paper includes the description of the PEM and some implementation issues, such as coupling the model with the flow simulator in order to use its basic calculated properties, and discusses some user options (such as property input through correlations or geostatistically obtained maps) as well as model variants and extensions (such as lithology influence and pressure effects). Three applications of this petroelastic model are shown: the first is a synthetic model based on outcrop data; the second is a 4D feasibility study for water injection monitoring in an offshore field; and the last is a comparison between observed and calculated P-wave impedances for an offshore field.
The resulting tool is applicable, for example, to 4D seismic feasibility studies and to seismic modeling for comparison with observed surveys, and it makes possible further implementations that incorporate seismic data into assisted history matching.
The use of petroelastic attributes serves several purposes1, such as assessing the feasibility of 4D seismic monitoring, optimizing the 4D seismic monitoring program and preparing more accurate production forecasts.
A possible workflow for applying 4D seismic in the monitoring of fluid flow in porous media follows the iterative steps2:
Steps 3 and 4 are unnecessarily cumbersome because most flow simulators do not calculate reservoir seismic attributes. As a result, information from the flow simulation (Step 3) must be converted to a format suitable for analysis in the PEM (Step 4).
In addition, errors may be introduced into the calculation of seismic attributes if fluid properties in the PEM do not match the corresponding fluid properties in the flow simulator, as happens when standard fluid-property correlations are used.
These problems can be avoided if the PEM is incorporated into the flow simulator, eliminating the need for third-party software to calculate the seismic attributes and ensuring that exactly the same fluid property model is used.
Fanchi1 shows results for several reservoir management scenarios, successfully applying petroelastic property information calculated by a flow simulator with an integrated implementation of Gassmann's equation3, thereby improving reservoir management and monitoring processes.
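Gassmann's equation3 itself is standard; a minimal sketch of the fluid-substitution step at the core of such a PEM follows (moduli in consistent units, e.g. GPa; the density update with fluid saturation is omitted for brevity):

```python
def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Saturated-rock bulk modulus from Gassmann's equation:
    K_sat = K_dry + (1 - K_dry/K_min)^2 /
            (phi/K_fl + (1 - phi)/K_min - K_dry/K_min^2)
    k_dry: dry-frame bulk modulus; k_min: mineral bulk modulus;
    k_fl: fluid bulk modulus; phi: porosity (fraction)."""
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + num / den

def p_impedance(k_sat, mu, rho):
    """Acoustic (P-wave) impedance Ip = rho * Vp, with Vp from the saturated
    bulk modulus and the shear modulus mu (unchanged by the fluid)."""
    vp = ((k_sat + 4.0 * mu / 3.0) / rho) ** 0.5
    return rho * vp
```

Because the saturated modulus depends on the fluid modulus, evaluating k_fl from the simulator's own PVT model, rather than from an external correlation, is precisely what avoids the mismatch errors discussed above.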
Gosselin et al.4 also implemented an integrated flow simulator tool5 using the Gassmann's equation in a project to integrate 4D data into an assisted history matching process.
The ultimate aim of this project is to incorporate time-lapse seismic attributes, through an integrated reservoir flow simulator, into an assisted history matching (AHM) tool that combines efficient derivative calculation with robust optimization techniques and was developed in a previous project6, thereby greatly facilitating the practical use of this kind of data.
The use of seismic attributes has increased, especially when they are extracted from interpreted horizons. The various available attributes are not independent of each other but represent, in fact, different ways of presenting and studying fundamental information from seismic data (time, amplitude, frequency and attenuation). However, statistical analysis using attributes must be based on geological knowledge and not only on mathematical correlation. Petrophysical studies and seismic modeling are sources of such understanding. This knowledge is necessary to improve confidence in observed correlations with reservoir parameters and must be part of every attribute analysis.
However, the use of seismic attributes raises several questions: What do they all mean? When should one or another be used? How should they be used in geologic modeling? How reliable are those data? The answers are not easy, but for petrophysical modeling (porosity, NTG and permeability), what is the best approach: to rely only on well data, which are point measurements and must be interpolated, or to seek correlations with physical measurements (seismic data)? Ignoring seismic attributes would feel like going back in time, when this important tool was not available.
On a giant oilfield offshore Brazil, seismic attributes ('conventional', complex trace, polynomial decomposition, geometric and coherence) have been used to create geological models and to reduce uncertainties. The attributes must be chosen by the geophysicist and the geologist working together, in order to check the geological meaning of the attribute maps, the possible physical meaning of each attribute, etc. Plots of the highest correlation values should be inspected visually in order to choose the attribute with the best correlation to the desired parameters.
The results show that attributes have been favourable for porosity and NTG prediction, but at best fair for permeability. Even though the permeability results are not as good, the correlations improve in the latest models as new wells are incorporated. Polynomial decomposition and complex trace attributes have shown the best results.
Introduction: seismic attribute definitions and discussions
The use of seismic attribute data for prediction of detailed reservoir properties began more than 30 years ago.
In fact, a seismic attribute is any property derived from the seismic reflection signal. Attributes may be compared to lithology in an attempt to devise a method of predicting properties away from well control. The prediction method can vary from a simple linear correlation to multi-attribute analysis, geostatistical methods, etc.
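As a minimal sketch of the simplest end of that spectrum, a linear correlation between an attribute sampled at wells and a measured property can be fit and then applied away from well control (all values below are hypothetical):

```python
import numpy as np

# Hypothetical data: attribute extracted at 6 well locations, measured porosity
attribute = np.array([0.12, 0.18, 0.25, 0.31, 0.40, 0.47])
porosity = np.array([0.08, 0.11, 0.14, 0.17, 0.22, 0.25])

# Least-squares linear fit and correlation coefficient
slope, intercept = np.polyfit(attribute, porosity, 1)
r = np.corrcoef(attribute, porosity)[0, 1]

def predict_porosity(attr_map):
    """Apply the well-derived linear relation to attribute values between wells."""
    return slope * np.asarray(attr_map) + intercept
```

A high r on a handful of wells is exactly the situation where the statistical pitfalls discussed below demand caution: the fit should be defensible geologically, not only numerically.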
As evidence of the current proliferation of attributes, Chen and Sidney (1997) catalogued more than 60 common seismic attributes along with a description of their apparent significance and utility.
Although there is a rich history of seismic attributes use in reservoir prediction, the practice remains a difficult and uncertain task. The bulk of this uncertainty arises from the nature of the physics connecting a number of attributes to a corresponding reservoir property. Due to the complex and varied physical processes responsible for various attributes the unambiguous use of attributes for direct prediction will probably remain a challenge for the years to come.
Beyond the physics described above, there is the possibility of encountering statistical pitfalls when using multiple attributes for empirical reservoir property prediction. In addition, many attributes are derived using similar signal processing methods and can, in some cases, be considered largely redundant with respect to their description of the seismic signal.
The facies and net-to-gross ratio (NTG) modeling methodology presented in this paper has been successfully applied to the geological characterization of a giant field in the Campos Basin, offshore Brazil. The methodology aims to preserve the original seismic-attribute trend of the NTG property in the facies model. In this way, more reliable horizontal parameters can be determined; these are usually hard to obtain because of the scarcity of wells and the large spacing between them in an oilfield at its early development stages.
The technique may be described as follows:
In the modeling phase, mainly in the upscaling step, a particular interval named inter-stratified received special treatment. It shows a NTG value of around 50% (against 85% for the field as a whole) and was defined in order to preserve the critical reservoir heterogeneities. Because it is treated differently during the upscaling step, its characteristics are preserved during the scale transfer, which consequently improves the history match.
Facies models may be constructed by two main methods: pixel-based (kriging, SGS) or object-based. The choice of the best one depends upon the available information, the characteristics of the field and the geological knowledge about it. The input data for both comprise the hard data (e.g. wells and facies proportions), which must be honored during simulation, and the soft data (e.g. seismic attributes, variogram parameters), which are used as trends and carry different weights within the simulations. As a result, each of these methods produces a distinct architecture and facies distribution. This paper describes the phases of facies and net-to-gross ratio (NTG) modeling (conditioned by maps obtained from seismic attributes) and their use in the upscaling phase of a giant field at Campos Basin. The methodology aims to embrace and preserve the seismic-attribute trend for NTG within the facies model itself, taking into account the facies proportion maps and variograms used in geostatistical simulation.
The reservoir is made up of turbiditic sandstones and conglomerates. The main zone may be divided into two stratigraphic subzones: upper and lower. The lower subzone is composed of sandstones and conglomerates deposited in a turbidite channel system. The upper subzone, deposited in a less confined setting, is composed of finer sandstones and intercalated shales. Both subzones were previously thought to be in communication, but after pilot production started, differential depletion was observed between them, indicating that the vertical hydraulic communication is restricted to some degree. This restriction has been attributed to an interval named inter-stratified (Inter), which presents a NTG of around 50% (against 85% for the field as a whole) and is located within the lower subzone. Although it is a meaningful heterogeneity, this interval cannot be mapped seismically, and it is essential that its characteristics be preserved when the geological grid is upscaled into the simulation grid. In order to keep its characteristics and to reach a better production history match, the interval was individualized within the geological model and treated differently in the upscaling step.
Up to now, different petrophysical methodologies have been developed to improve the success rate in selecting oil intervals in the Gulf of San Jorge Basin oilfields. These methodologies have been successful only in a limited number of cases and a solution that has field-wide applicability has been lacking. This project attempts to optimize previous results using an integrated petrophysical characterization workflow.
The geological complexity of the Estancia Cholita Field, which is mostly due to limited lateral continuity and small reservoir bed thickness, particularly in the Castillo Formation, makes layer-by-layer correlation difficult. Several other factors add to the complexity of petrophysical evaluations, including alteration of petrophysical properties in certain zones near faults, variation in tuffaceous material content, formation damage, invasion of drilling fluids, zones with viscous oil, and low, variable formation-water salinity.
During this study, a two-step petrophysical evaluation workflow was developed. The first step applies quick-look log analysis techniques, especially the "Ratio" methodology and the free-fluid and permeability indexes (NMR), to identify the intervals of greatest potential in the well. The second step consists of a detailed analysis, in the zones where water production is forecast, to calculate a formation-water resistivity profile along the well.
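One common way to build such a profile is the apparent water resistivity derived from Archie's equation; the Archie parameters and log values below are assumptions for illustration, not field data:

```python
import numpy as np

def apparent_rw(rt, phi, a=1.0, m=2.0):
    """Apparent formation-water resistivity from Archie's equation:
    Rwa = Rt * phi^m / a.
    In a clean, fully water-saturated interval Rwa approaches the true Rw;
    a (tortuosity factor) and m (cementation exponent) are assumed values."""
    return np.asarray(rt, float) * np.asarray(phi, float) ** m / a

# Hypothetical depth profile: deep resistivity (ohm.m) and porosity (fraction)
rt = [2.0, 1.8, 5.0, 2.1]
phi = [0.25, 0.26, 0.24, 0.25]
rwa = apparent_rw(rt, phi)   # low, stable values flag water-bearing intervals
```

Intervals where Rwa rises well above the stable background value are candidates for hydrocarbon saturation, which is the kind of discrimination the detailed second step supports.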
Preliminary results of the study confirm that combining this methodology with local experience, layer-by-layer correlations, formation pressures data, log analysis, and supplementary information helps improve predictions of oil bearing zones in the Estancia Cholita Field.