Scale inhibitor squeeze treatments are among the most common techniques to prevent oilfield mineral scale deposition in oil producers. A squeeze treatment's effectiveness and lifespan are determined by scale inhibitor (SI) retention, which can be described by a pseudo-adsorption isotherm, commonly derived from coreflooding experiments. In certain circumstances, however, a new isotherm must be re-derived to match the field return concentration profile once the treatment is deployed and samples are collected to measure SI return concentration; this new isotherm is then used to design the next treatment. The objective of this manuscript is to quantify the associated uncertainty, which depends on the number of samples analyzed. As in any inverse problem, there may not be a unique solution, which in our context is a pseudo-isotherm matching the return concentration profile. Consequently, there will be a certain level of uncertainty in predicting the next squeeze treatment's lifetime. By solving this inverse problem in a Bayesian formulation, incorporating the prior information and a likelihood based on the return concentration profile, it is possible to characterize the posterior distribution and therefore calculate the uncertainty range, commonly expressed as P90/P50/P10, using the Randomized Maximum Likelihood (RML) approach. The P90/P50/P10 was calculated as a function of the number of samples available, differentiating between early and late production.
The results suggest a correlation between the P90/P50/P10 interval and the number of samples: the difference between the P10 and P90 forecast squeeze lifetimes widened as the number of samples decreased. The proposed methodology may be used to determine the number of samples required to reduce the uncertainty in predicting the lifetime of the next squeeze treatment. Although taking more samples may increase the cost per barrel for a treatment, the ability to predict treatment lifetime accurately will be more cost effective in the long term, as production might not be affected.
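The RML workflow described above can be sketched in a few lines. The forward model, parameter names, and all numerical values below are illustrative assumptions, not the authors' isotherm model; each RML sample perturbs both the data and the prior, then minimises the combined misfit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy forward model: SI return concentration declining with time, controlled
# by a single hypothetical "retention" parameter k (an illustrative stand-in
# for the pseudo-isotherm, not the authors' actual model).
def forward(k, t):
    return np.exp(-t / k)

t_obs = np.linspace(1.0, 10.0, 20)                 # sampling times
sigma_d = 0.02                                     # measurement noise std
c_obs = forward(4.0, t_obs) + rng.normal(0.0, sigma_d, t_obs.size)

k_prior_mean, k_prior_std = 3.5, 1.0               # Gaussian prior on k

def rml_sample():
    # Randomized Maximum Likelihood: perturb the data and the prior mean,
    # then minimise the combined misfit (brute force over a 1-D grid).
    d_pert = c_obs + rng.normal(0.0, sigma_d, c_obs.size)
    k_pert = rng.normal(k_prior_mean, k_prior_std)
    ks = np.linspace(1.0, 10.0, 2000)
    misfit = (((forward(ks[:, None], t_obs) - d_pert) ** 2).sum(axis=1)
              / sigma_d**2 + (ks - k_pert) ** 2 / k_prior_std**2)
    return ks[np.argmin(misfit)]

samples = np.array([rml_sample() for _ in range(200)])
# Oilfield convention: P90 is the conservative (low) value, P10 the high one
p90, p50, p10 = np.percentile(samples, [10, 50, 90])
```

The spread of `samples` narrows as more (or less noisy) return-concentration measurements are included, which is the mechanism behind the sample-count dependence reported above.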
Santos, Letícia Siqueira dos (UNICAMP) | Santos, Susana Margarida da Graça (UNICAMP) | Santos, Antonio Alberto de Souza dos (UNICAMP) | Schiozer, Denis José (UNICAMP) | Silva, Luis Otávio Mendes da (UNICAMP)
The Expected Value of Information (EVoI) is a criterion for analyzing the feasibility of acquiring new information to deal with uncertainties and improve decisions at any stage of an oil field's life. Here, we evaluate the influence of using representative models (RM) on the EVoI estimation and on the decision to develop the petroleum field. These RM are used to represent a large set of models that honor production data (FM), considering uncertainties in reservoir, fluid, and economic parameters, and enable the following processes: (1) optimizing production strategies (specialized for each RM and robust to all RM), (2) performing risk analysis, (3) selecting the strategy to develop the field based on the risk analysis, and (4) estimating the EVoI. We evaluated the influence of the number of RM on these processes, observing how reducing computational costs affects the results. For the EVoI, we applied a Complete (EVoI_FM) and a Simplified (EVoI_RM) methodology, where EVoI_FM was evaluated with all models (FM) while EVoI_RM used groups with different numbers of RM (GR1, GR2, and GR3, ranging from 9 to 150 models per group). To assess the quality of the results, we used the complete estimate (EVoI_FM) as a reference. The study was conducted on UNISIM-I-D, a benchmark oil reservoir in the development phase, taking an appraisal well as the source of information to clarify a structural uncertainty. Using RM to optimize specialized production strategies proved useful, since optimizing strategies for all FM would require high computational costs. Moreover, the RM could be used to represent risk curves and select production strategies under uncertainty, but less precisely, directly affecting the EVoI results (the EVoI being the difference between the expected values of the two curves). The precision of the EVoI_RM results varied according to the number and group of RM employed, and the best strategies selected for field development varied as well.
The choice of whether to use simplifications will depend on the accuracy required and the available resources. Variations in EVoI_RM may be tolerable compared to the time saved, leaving the decision maker free to choose the best estimation method.
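The core EVoI arithmetic (the difference between the expected values of the two risk curves) can be illustrated with a toy discrete example; the scenario probabilities and NPVs below are invented, and perfect information is assumed for simplicity:

```python
import numpy as np

# Minimal EVoI arithmetic with hypothetical numbers: three reservoir
# scenarios and two candidate production strategies.
p = np.array([0.3, 0.4, 0.3])                 # scenario probabilities
npv = np.array([[100.0, 80.0, 20.0],          # strategy A, NPV per scenario
                [ 60.0, 90.0, 70.0]])         # strategy B

# Without information: commit to the single strategy with the best EV
ev_without = (npv @ p).max()                  # strategy B: 75

# With (perfect) information: pick the best strategy per revealed scenario
ev_with = (npv.max(axis=0) * p).sum()         # best per scenario: 87

evoi = ev_with - ev_without                   # 87 - 75 = 12 (up to rounding)
```

In the paper's setting the information is imperfect and the expected values come from full risk curves over FM or RM, but the subtraction at the end is the same.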
Various physico-chemical processes affect Alkali Polymer (AP) Flooding. Core floods can be performed to determine ranges for the parameters used in the numerical models describing these processes. Because the parameters are uncertain, prior parameter ranges are introduced and then conditioned to observed data. It is challenging to determine posterior distributions of the various parameters, as they need to be consistent with the different sets of data that are observed (e.g. pressures, oil and water production, and chemical concentration at the outlet).
Here, we apply Machine Learning in a Bayesian framework to condition parameter ranges to a multitude of observed data.
To generate the response of the parameters, we used a numerical model and applied Latin Hypercube Sampling (2000 simulation runs) from the prior parameter ranges.
To ensure that sufficient parameter combinations of the model comply with the various observed data, Machine Learning can be applied. After defining multiple Objective Functions (OFs) covering the different observed data (here, six), we used the Random Forest algorithm to generate a statistical model for each Objective Function.
Next, parameter combinations whose results fall outside the acceptance limit of the first Objective Function are rejected. Resampling is then performed and the next Objective Function is applied, and so on until the last Objective Function is reached. To account for parameter interactions, the resulting parameter distributions are tested against the limits of all the Objective Functions.
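A minimal sketch of this sequential rejection step, with simple analytic functions standing in for the trained Random Forest surrogates and the intermediate resampling omitted for brevity (all parameter meanings, functions, and limits are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

# Prior sample of 3 hypothetical parameters in [0, 1]; in the workflow each
# row would correspond to a Latin Hypercube simulation run.
n = 2000
params = rng.uniform(0.0, 1.0, size=(n, 3))

# Placeholder surrogates standing in for the trained Random Forest models:
# each maps parameter vectors to one Objective Function (misfit) value.
surrogates = [
    lambda p: np.abs(p[:, 0] - 0.4),            # e.g. pressure misfit
    lambda p: np.abs(p[:, 1] - 0.6),            # e.g. oil-rate misfit
    lambda p: np.abs(p[:, 0] + p[:, 2] - 1.0),  # e.g. outlet-concentration misfit
]
limits = [0.3, 0.3, 0.4]    # acceptance limit per Objective Function

accepted = params
for of, lim in zip(surrogates, limits):
    accepted = accepted[of(accepted) <= lim]    # reject, then move to next OF

# Final consistency check: survivors must satisfy *all* OF limits at once,
# which is how parameter interactions are accounted for.
assert all((of(accepted) <= lim).all() for of, lim in zip(surrogates, limits))
posterior = accepted
```

The surviving rows approximate the posterior parameter combinations consistent with every set of observed data.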
The results show that posterior parameter distributions can be efficiently conditioned to the various sets of observed data. Insensitive parameter ranges are not modified, as they are not influenced by the information in the observed data. This is crucial because parameters that are insensitive in history matching could become sensitive in the forecast if the production mechanism changes.
The workflow introduced here can be applied for conditioning parameter ranges of field (re-)development projects to various observed data as well.
Objectives/Scope: A stable, single-well deconvolution algorithm introduced for well test analysis in the early 2000s makes it possible to obtain information about the reservoir system that is not always available from individual flow periods, for example the presence of heterogeneities and boundaries. One issue, recognised but largely ignored, is that of uncertainty in well test analysis results and non-uniqueness of the interpretation model. In a previous paper (SPE 164870), we assessed these with a Monte Carlo approach in which multiple deconvolutions were performed over the ranges of expected uncertainties affecting the data (Monte Carlo deconvolution). Methods, Procedures, Process: In this paper, we use nonlinear Bayesian regression based on models of reservoir behaviour to make inferences about the interpretation model. This allows us to account for uncertainty in the measurements, which are usually contaminated with large observational errors. We combine the likelihood with flexible probability distributions for the inputs (priors), and we use Markov Chain Monte Carlo algorithms to approximate the probability distribution of the result (posterior). Results, Observations, Conclusions: We validate and illustrate the algorithm by applying it to the same synthetic and field data sets as in SPE 164870, using a variety of tools to summarise and visualise the posterior distribution and to carry out model selection. Novel/Additive Information: The approach used in this paper has several advantages over Monte Carlo deconvolution: (1) it gives access to meaningful system parameters associated with the flow behaviour in the reservoir; (2) it makes it possible to incorporate prior knowledge in order to exclude nonphysical results; and (3) it allows parameter uncertainty to be quantified in a principled way by exploiting the advantages of the Bayesian approach.
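A minimal sketch of the Bayesian regression machinery referred to above, assuming a toy log-time pressure model and a random-walk Metropolis sampler (one of the simplest MCMC algorithms); the model and all values are illustrative, not the paper's reservoir models:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy pressure response p(t) = m * log(t) + c; infer the slope m (standing in
# for a meaningful system parameter) from noisy observations, Gaussian prior.
t = np.linspace(1.0, 100.0, 50)
m_true, c, sigma = 2.0, 5.0, 0.5
d_obs = m_true * np.log(t) + c + rng.normal(0.0, sigma, t.size)

def log_post(m):
    resid = d_obs - (m * np.log(t) + c)
    # Gaussian likelihood + Gaussian prior N(1.5, 1) on m
    return -0.5 * np.sum(resid**2) / sigma**2 - 0.5 * (m - 1.5) ** 2

# Random-walk Metropolis: propose a step, accept with probability
# min(1, posterior ratio)
m, lp, chain = 1.5, log_post(1.5), []
for _ in range(5000):
    m_new = m + rng.normal(0.0, 0.05)
    lp_new = log_post(m_new)
    if np.log(rng.uniform()) < lp_new - lp:
        m, lp = m_new, lp_new
    chain.append(m)

post = np.array(chain[1000:])          # discard burn-in; approximate posterior
```

Summaries of `post` (mean, quantiles, histograms) correspond to the posterior visualisation and uncertainty quantification described in the abstract.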
History matching field performance is a time-consuming, complex, and non-unique inverse problem that yields multiple plausible solutions, owing to the inherent uncertainty in geological and flow modeling. History matching must be performed diligently, with the ultimate objective of providing reliable prediction tools for managing oil and gas assets. Our work capitalizes on the latest developments in ensemble Kalman techniques, namely the Ensemble Kalman Filter and Smoother (EnKF/S), to properly quantify and manage reservoir model uncertainty throughout the process of model calibration and history matching.
Sequential and iterative EnKF/S algorithms have been developed to overcome the shortcomings of existing methods, such as the lack of data assimilation capabilities, limited abilities to quantify and manage uncertainties, and the huge number of simulation runs required to complete a study. An initial ensemble of 40 to 50 equally probable reservoir models was generated with variable areal and vertical permeability and porosity. The initial ensemble captured the most influential reservoir properties, which are propagated and honored by the subsequent ensemble iterations. Data misfits between the field historical data and the simulated data are calculated for each realization to quantify the impact of reservoir uncertainty and to determine the changes to horizontal permeability, vertical permeability, and porosity for the next iteration. Each iteration of the optimization process reduces the data misfit compared to the previous one. The process continues until a satisfactory field-level and well-level history match is reached, or until there is no further improvement.
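The analysis (update) step at the heart of the EnKF can be sketched with a linear toy forward operator; the ensemble size matches the 40-50 range quoted above, but the operator, dimensions, and noise levels below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy EnKF analysis step: 40 members, 5 "permeability" parameters, 3 observed
# data; H is a hypothetical linear forward operator.
n_ens, n_par, n_obs = 40, 5, 3
X = rng.normal(3.0, 0.5, size=(n_par, n_ens))       # prior ensemble (log-perm)
H = rng.uniform(0.0, 1.0, size=(n_obs, n_par))      # stand-in forward operator
R = 0.1 * np.eye(n_obs)                             # observation-error covariance
d = H @ np.full(n_par, 3.2) + rng.multivariate_normal(np.zeros(n_obs), R)

Y = H @ X                                           # predicted data per member
Xm, Ym = X.mean(1, keepdims=True), Y.mean(1, keepdims=True)
A, B = X - Xm, Y - Ym                               # ensemble anomalies

# Kalman gain from sample covariances: K = C_xy (C_yy + R)^-1
K = (A @ B.T) @ np.linalg.inv(B @ B.T + (n_ens - 1) * R)

# Perturbed observations, one realisation per member
D = d[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T

X_post = X + K @ (D - Y)                            # analysis (updated) ensemble
```

Iterating this update over assimilation cycles, with the full reservoir simulator in place of `H`, is the EnKF/S loop described above; the spread of `X_post` carries the uncertainty estimate.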
In this study, an application of EnKF/S is demonstrated for history matching of a faulted reservoir model under waterflooding conditions. The different implementations of EnKF/S were compared. EnKF/S preserved key geological features of the reservoir model throughout the history matching process. During this study, EnKF/S served as a bridge between classical control theory solutions and Bayesian probabilistic solutions of sequential inverse problems. EnKF/S methods demonstrated good tracking qualities while giving some estimate of uncertainty as well.
The updated reservoir properties (horizontal and vertical permeability and porosity) are conditioned throughout the EnKF/S cycles, maintaining consistency with the initial geological understanding. The workflow produced an enhanced history match in a shorter turnaround time, with far fewer simulation runs than traditional genetic or evolutionary algorithms. The geological realism of the model is retained for robust prediction and development planning.
Ranaee, Ehsan (Department of Energy, Politecnico di Milano) | Guédon, Gaël Raymond (Department of Energy, Politecnico di Milano) | Moghadasi, Leili (Eni SpA) | Inzoli, Fabio (Department of Energy, Politecnico di Milano) | Riva, Monica (Department of Civil and Environmental Engineering, Politecnico di Milano) | Maddinelli, Giuseppe (Eni SpA) | Bartosek, Martin (Eni SpA) | Guadagnini, Alberto (Department of Civil and Environmental Engineering, Politecnico di Milano)
We aim to develop a viable workflow for the characterization of reservoir responses under Water Alternating Gas (WAG) conditions for enhanced oil recovery. We do so through a numerical Monte Carlo (MC) framework and by relying on (…)
We consider uncertainty in (…)
In the case of a homogeneous reservoir, reservoir simulation responses are strongly sensitive to the degree of convexity of the two-phase relative permeability curves. In heterogeneous reservoir settings, results are comparably sensitive to porosity, the characteristics of the relative permeability model, and the degree of heterogeneity of the reservoir. In well-connected (randomly) heterogeneous fields, porosity is more influential than in heterogeneous settings lacking well-connected regions.
Characterization of reservoir model attributes relying on pore-scale simulation approaches in the presence of uncertainty can provide a robust term of comparison, which can be integrated within a classical reservoir simulation approach relying on relative permeability data stemming from core-flooding experiments. Our results document that uncertainties in the evaluation of (…)
Hydrocarbon field (re-)development projects require the evaluation of a large number of development options under uncertainty. Furthermore, information from data gathering programs might narrow parameter ranges and change the choice of the preferred development option.
The large number of development options (and corresponding decisions) that must be taken under uncertainty makes it necessary to determine the impact of the decisions on the (re-)development project objective. Knowing the sensitivity of the project objective (e.g. NPV) to the decisions allows for resource and data acquisition planning. The impact of decisions on the project value can be determined by performing a Generalized Sensitivity Analysis. This analysis does not replace Value of Information, but it facilitates planning and allows focusing on important decisions.
To further improve Decision Analysis and focus on important parameters, a Generalized Sensitivity Analysis of the effect of uncertain parameters on decisions can be performed. The advantage of such an investigation over a sensitivity analysis on Oil Originally in Place (OOIP) or Net Present Value (NPV) is that it includes parameter interactions. Furthermore, it covers the impact of a parameter on the decisions directly, rather than indirectly as when OOIP or NPV sensitivity is used.
The analysis is demonstrated on an example project in a Decision Analysis framework. The use of decision impact evaluation and parameter assessment on decisions might lead to more focused and faster hydrocarbon field (re-)development project execution.
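One common form of Generalized Sensitivity Analysis (regional sensitivity analysis in the Hornberger-Spear sense) splits Monte Carlo samples by the decision they lead to and compares the parameter distributions between the groups; the decision rule and payoff model below are hypothetical stand-ins:

```python
import numpy as np

rng = np.random.default_rng(4)

# Monte Carlo sample of three hypothetical uncertain parameters in [0, 1];
# a toy NPV model in which x0 dominates and x2 is irrelevant.
n = 5000
params = rng.uniform(0.0, 1.0, size=(n, 3))
npv = 2.0 * params[:, 0] + 0.1 * params[:, 1] + rng.normal(0.0, 0.2, n)
develop = npv > 1.2                              # decision: develop or not

def ks_distance(a, b):
    # Two-sample Kolmogorov-Smirnov statistic (maximum CDF gap)
    grid = np.sort(np.concatenate([a, b]))
    def cdf(x):
        return np.searchsorted(np.sort(x), grid, side="right") / x.size
    return np.abs(cdf(a) - cdf(b)).max()

# Parameter impact on the decision: distance between the two conditional
# parameter distributions. Large distance => parameter drives the decision.
impact = [ks_distance(params[develop, i], params[~develop, i])
          for i in range(3)]
```

Here `impact[0]` is large and `impact[2]` is near zero, which is exactly the interaction-aware "impact of a parameter on the decision" ranking described above.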
Oil and gas production applications demand high reliability, which implies a low failure probability over the lifetime of the components. The service environments encompass high-pressure/high-temperature (HPHT) conditions on the inside and seawater with cathodic protection (CP) systems on the outside. New material combinations are being employed, including thermo-mechanically processed low alloy steels and corrosion resistant alloys (CRAs). Additively manufactured (AM) alloys are also entering the market, creating further variability and directionality in properties. Material qualification involves two types of accelerated laboratory tests: standard and fit-for-purpose. However, the acceleration vectors used in these tests are usually not clearly identified. Relating materials qualification tests to service performance requires a systematic approach that considers the uncertainties in the acceleration vectors, such as loading conditions, environmental compositions, operating parameters, and material microstructures. In this paper, we describe the acceleration vectors typically used in qualification tests and a risk framework that may be used to relate the test results to long-term service performance. The benefit of a risk-informed approach is not only the avoidance of unnecessary testing, but also the investment of testing resources to attain the highest reliability.
Geophysical seismic surveys have been applied to marine geo-site characterization to create images of the complex geological conditions under the seafloor. Accurate knowledge of the ground conditions is critical for geo-risk assessment purposes such as mapping shallow gas hydrate deposits, over-pressured zones, or geological anomalies. Traditional seismic reflection profiling is a relatively fast and flexible method of processing seismic data to recover information on the spatial variation of facies boundaries and subsurface structure. However, it usually does not provide quantitative information on the composition of the sediments and their physical properties. Seismic inversion converts the wave signals from the time domain to the space domain and derives specific material properties through iterative numerical modeling. In this paper, we introduce a probabilistic seismic inversion scheme to recover vertical profiles of shallow soil bulk density from marine seismic survey data. This acoustic impedance inversion is based on the geophysical seismic convolution model and the reversible jump Markov chain Monte Carlo (rj-MCMC) method. The rj-MCMC is a recently developed stochastic sampling technique that allows the number of layers under the seafloor to vary freely, so the number of soil units is estimated from the data in an objective manner. We applied this new approach to a single trace of post-stack seismic data collected from the Hydrate Ridge area off the west coast of Oregon. Since the purpose of this shallow seismic inversion is to support offshore foundation design, we focused on the relatively short window of seismic signals near the seafloor. The inverted bulk density profiles along depth compare well with field measurements performed at a nearby drilled borehole.
This study demonstrates the advantages of the probabilistic seismic inversion approach for supporting shallow marine site characterization from low-frequency data, and we discuss the benefits of this new approach for geotechnical site characterization.
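The convolutional forward model that underlies this kind of acoustic-impedance inversion can be sketched as follows; the layer impedances, reflector positions, and wavelet frequency are illustrative values, not data from the study:

```python
import numpy as np

# Convolutional forward model: reflection coefficients from hypothetical
# layer acoustic impedances, convolved with a Ricker wavelet to produce a
# synthetic trace (the quantity compared against the observed seismic data
# inside an MCMC sampler).
z = np.array([1.5e6, 2.0e6, 1.8e6, 2.4e6])        # layer impedances (kg/m^2/s)
r = (z[1:] - z[:-1]) / (z[1:] + z[:-1])           # reflection coefficients

refl = np.zeros(100)                              # reflectivity series in time
refl[[20, 50, 80]] = r                            # reflector sample positions

dt, f = 0.002, 40.0                               # 2 ms sampling, 40 Hz Ricker
t = np.arange(-0.05, 0.05, dt)
w = (1.0 - 2.0 * (np.pi * f * t) ** 2) * np.exp(-(np.pi * f * t) ** 2)

trace = np.convolve(refl, w, mode="same")         # synthetic seismogram
```

In the rj-MCMC scheme the number of entries in `z` (and hence of reflectors) is itself a sampled variable, which is what lets the data determine the number of soil units.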
Beck, Andre T. (University of São Paulo) | Cecílio, Diogo L. (Federal University of Rio Grande do Sul) | Gomes, Wellison J. S. (Federal University of Santa Catarina) | Tessari, Rodolfo (Technical Federal University of Paraná) | Jacinto, Carlos M. C. (Petrobras)
Summary The development of exploratory wells is based on the interpretation of seismic readings by geotechnical field experts. Correlation wells, when they exist, are located very far away, leading to large uncertainties and to different interpretations by different experts. Exploratory wells are expensive and risky endeavors, in particular deep wells with high-temperature, high-pressure conditions. Reaching reservoirs at record depths often requires challenging usual safety margins, which should be based on reliability analyses. In this paper, we present an empirical random kick model that combines the opinions of different specialists, and a reliability analysis of casing subject to kick loads. The empirical kick model has random kick volume and intensity, gas gravity, and influx depth. Three tubular strength models are considered: the Barlow formula, the von Mises stress, and the Klever-Stewart model (Klever and Stewart 1998; Stewart and Klever 1998), with random yield stress and tube thickness. A typical 19,000-ft well is considered as a case study, and the First-Order Reliability Method (FORM) is used as the probabilistic solver. Results show the differences between safety factors and reliability indices obtained for the three strength models. Sensitivity factors show that yield stress and model error are the most relevant random variables at shallow depths. Kick volume and influx depth are found to be the most relevant random variables of the kick model, especially at greater depths. The proposed empirical kick model is shown to have a significant impact on casing reliability, in comparison with a kick in which the uncertain variables are assumed to follow uniform distributions.
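The FORM solver mentioned in the summary can be sketched with the classic HLRF fixed-point iteration; the limit-state function below is a hypothetical linear stand-in for the casing strength-minus-kick-load models, not the paper's actual formulation:

```python
import math
import numpy as np

# FORM via the HLRF iteration on a toy limit state in standard normal
# (u-)space: g(u) = 3.0 - u0 - 0.5*u1, standing in for
# "strength minus kick load" after transformation of the random variables.
def g(u):
    return 3.0 - u[0] - 0.5 * u[1]

def grad_g(u):
    return np.array([-1.0, -0.5])   # constant gradient for this linear g

u = np.zeros(2)                     # start at the mean (origin of u-space)
for _ in range(20):                 # HLRF: u <- [grad.u - g(u)] grad / |grad|^2
    gr = grad_g(u)
    u = gr * (gr @ u - g(u)) / (gr @ gr)

beta = np.linalg.norm(u)            # reliability index = design-point distance
pf = 0.5 * math.erfc(beta / math.sqrt(2.0))   # failure probability Phi(-beta)
```

For a linear limit state the iteration converges in one step; the components of the final `u` (normalized) are the sensitivity factors the summary refers to.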