Luo, Xiaodong (International Research Institute of Stavanger) | Lorentzen, Rolf J. (International Research Institute of Stavanger) | Valestrand, Randi (International Research Institute of Stavanger) | Evensen, Geir (International Research Institute of Stavanger and Nansen Environmental and Remote Sensing Center)
Ensemble-based methods are among the state-of-the-art history-matching algorithms. However, in practice, they often suffer from ensemble collapse, a phenomenon that deteriorates history-matching performance. It is customary to equip an ensemble history-matching algorithm with a localization scheme to prevent ensemble collapse. Conventional localization methods use distances between the physical locations of model variables and observations to modify the degree of the observations’ influence on model updates. Distance-based localization methods work well in many problems, but they also suffer from dependence on the physical locations of both model variables and observations, the challenges in dealing with nonlocal and time-lapse measurements, and the nonadaptivity to handling different types of model variables.
To enhance the applicability of localization to various history-matching problems, we adopt an adaptive localization scheme that exploits the correlations between model variables and simulated observations. We elaborate on how correlation-based adaptive localization can overcome or mitigate the issues that arise in conventional distance-based localization.
To demonstrate the efficacy of correlation-based adaptive localization, we adopt an iterative ensemble smoother (iES) with the proposed localization scheme to history match the real production data of the Norne Field model, and we compare the history-matching results with those obtained by using the iES with distance-based localization. Our study indicates that when compared with distance-based localization, correlation-based localization not only achieves close or better performance in terms of data mismatch, but also is more convenient to use in practical history-matching problems. As a result, the proposed correlation-based localization scheme might serve as a viable alternative to conventional distance-based localization.
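As a rough illustration of the correlation-based idea (not the paper's exact algorithm), a taper can be built from sample correlations between model variables and simulated observations and applied elementwise to the update; the hard threshold used here is an assumption:

```python
import numpy as np

def correlation_taper(M, D, threshold=0.3):
    """Screen model-variable/observation pairs by sample correlation.
    M: model variables (Nm x Ne), D: simulated observations (Nd x Ne).
    The hard threshold is illustrative; the paper's scheme is adaptive."""
    Mc = M - M.mean(axis=1, keepdims=True)
    Dc = D - D.mean(axis=1, keepdims=True)
    num = Mc @ Dc.T                                   # Nm x Nd
    denom = np.outer(np.linalg.norm(Mc, axis=1),
                     np.linalg.norm(Dc, axis=1))
    corr = num / np.where(denom == 0.0, 1.0, denom)   # sample correlations
    return (np.abs(corr) >= threshold).astype(float)  # Nm x Nd taper

# The taper is applied elementwise (Schur product) to the Kalman gain:
# K_localized = taper * K
```

Because the taper is computed from the ensemble itself, no physical locations are needed, which is what makes the scheme applicable to nonlocal and time-lapse data.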
Statoil is implementing “Fast Model Update (FMU),” an integrated and automated workflow for reservoir modeling and characterization. FMU connects all steps and disciplines from seismic depth conversion to prediction and reservoir management, taking into account relevant reservoir uncertainty. FMU delivers an ensemble of geologically consistent and history-matched model realizations that together characterize the reservoir uncertainty. FMU facilitates management of the reservoir (e.g., field-development plans, well planning, and drainage strategies) in both early-phase and mature projects, properly taking geological uncertainty (including structure, facies, and rock/fluid properties) into account. The focus of the paper is to demonstrate how FMU is used in an algorithm for robust optimization of wells (e.g., well targets, infill wells, drilling priority, and rate control). In the current paper we demonstrate some early results where the drilling sequence of wells is optimized under geological uncertainty, using an ensemble of models conditioned on all available data. The final product provides an optimized drilling schedule and drilling time. The paper includes examples from one synthetic and one real field application.
This paper compares two ensemble-based data-assimilation methods when solving the history-matching problem in reservoir-simulation models. The methods are the Ensemble Kalman Filter (EnKF) and the Ensemble Smoother (ES). EnKF has been used extensively in petroleum applications, while ES is now used for the first time for history matching. ES differs from EnKF by computing a global update in the space-time domain, rather than using recursive updates in time as in EnKF. Thus, the sequential updating of the realizations with associated restarts is avoided.
EnKF and ES provide identical solutions for linear dynamical models. However, for nonlinear dynamical models, and in particular models with chaotic dynamics, EnKF is superior to ES because the recursive updates keep the model on track and close to the true solution. Thus, ES has seen little use, and EnKF has been the method of choice in most data-assimilation studies where ensemble methods are used.
On the other hand, reservoir simulation models are rather diffusive systems when compared to the chaotic dynamical models that were previously used to test ES. If we can assume that the model solution is stable with respect to small perturbations in the initial conditions and the history-matching parameters, then ES should give similar results to EnKF, and ES may be a more efficient and much simpler method to implement and apply.
In this paper we compare EnKF and ES and show that ES indeed provides an efficient ensemble-based method for history matching.
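A single global ES update of the parameter ensemble can be sketched as follows (perturbed-observation form; the naive matrix inverse and all names are simplifications of what a production implementation would use):

```python
import numpy as np

def es_update(X, D, d_obs, Cd, rng):
    """One global ensemble-smoother update (perturbed-observation form).
    X: parameter ensemble (Nm x Ne); D: simulated observations over the
    whole history (Nd x Ne); d_obs: observed data (Nd,); Cd: observation
    error covariance (Nd x Nd). A minimal sketch; real codes would use an
    SVD with truncation rather than a direct inverse."""
    Ne = X.shape[1]
    Xp = X - X.mean(axis=1, keepdims=True)            # parameter anomalies
    Dp = D - D.mean(axis=1, keepdims=True)            # observation anomalies
    E = rng.multivariate_normal(np.zeros(len(d_obs)), Cd, size=Ne).T
    K = Xp @ Dp.T @ np.linalg.inv(Dp @ Dp.T + (Ne - 1) * Cd)
    return X + K @ (d_obs[:, None] + E - D)           # updated ensemble
```

EnKF applies the same kind of update recursively at each assimilation time, with simulator restarts in between; ES performs it once over the whole data record.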
Although typically large uncertainties are associated with reservoir structure, the reservoir geometry is usually fixed to a single interpretation in history-matching workflows, and focus is on the estimation of geological properties such as facies location, porosity, and permeability fields. Structural uncertainties can have significant effects on the bulk reservoir volume, well planning, and predictions of future production.
In this paper, we consider an integrated reservoir-characterization workflow for structural-uncertainty assessment and continuous updating of the structural reservoir model by assimilation of production data. We address some of the challenges linked to structural-surface updating with the ensemble Kalman filter (EnKF).
An ensemble of reservoir models, expressing explicitly the uncertainty resulting from seismic interpretation and time-to-depth conversion, is created. The top and bottom reservoir-horizon uncertainties are considered as a parameter for assisted history matching and are updated by sequential assimilation of production data using the EnKF. To avoid modifications in the grid architecture and thus to ensure a fixed dimension of the state vector, an elastic-grid approach is proposed. The geometry of a base-case simulation grid is deformed to match the realizations of the top and bottom reservoir horizons.
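The elastic-grid idea can be illustrated with a toy one-dimensional column: the layer count stays fixed while the base-case layer depths are linearly rescaled onto an updated top/bottom horizon realization (a sketch of the concept, not the paper's implementation):

```python
import numpy as np

def deform_depths(z_layers, top_new, bottom_new):
    """Rescale base-case layer depths (top to bottom) onto updated top
    and bottom horizons, keeping the number of layers fixed so the state
    vector keeps a fixed dimension. Toy 1D sketch; function name and the
    purely linear stretch are illustrative assumptions."""
    z = np.asarray(z_layers, dtype=float)
    t, b = z[0], z[-1]
    return top_new + (z - t) / (b - t) * (bottom_new - top_new)
```

Because every realization keeps the same layer topology, the deformed grids differ only in node depths, which is what allows the EnKF state vector to stay the same size across the ensemble.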
The method is applied to a synthetic example, and promising results are obtained. The result is an ensemble of history-matched structural models with reduced and quantified uncertainty. The updated ensemble of structures provides a more reliable characterization of the reservoir architecture and a better estimate of the field oil in place.
This paper demonstrates the potential and advantages of the Ensemble Kalman filter (EnKF) as a tool for assisted history matching, based on its sequential processing of measurements, its capability of handling large parameter sets, and on the fact that it solves the combined state and parameter estimation problem.
A method and a thorough workflow for updating reservoir simulation models using the EnKF is developed. In addition, we present a method for updating relative permeability curves, as well as an improved approach for updating fault transmissibility multipliers.
The proposed workflow has been applied to a complex North Sea oil field. The EnKF successfully provides an ensemble of history-matched reservoir models. A significant improvement in the history match is obtained by updating the relative permeability properties in addition to the porosity and permeability fields and initial fluid contacts. Fault multipliers are estimated, and it is shown how the use of transformations, which handle non-Gaussian model variables, makes it possible to determine whether a fault is open, closed, or partially closed with respect to flow.
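The transformation idea for fault multipliers can be sketched as a piecewise map from a Gaussian-distributed EnKF variable to a bounded multiplier; the cutoffs and the linear ramp here are illustrative assumptions, not the paper's actual transformation:

```python
def fault_multiplier(x, a=-1.0, b=1.0):
    """Map a Gaussian-distributed EnKF state variable x to a fault
    transmissibility multiplier in [0, 1]: x <= a gives 0 (closed),
    x >= b gives 1 (open), with a linear ramp (partially closed) in
    between. The piecewise shape and cutoffs a, b are assumptions."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)
```

The EnKF then updates the unbounded Gaussian variable x, so the analysis stays consistent with the filter's Gaussian assumptions while the physical multiplier can sit exactly at 0 or 1.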
The presented method is an innovative contribution to reservoir-management workflows, where there is growing interest in real-time applications and fast model updating. Sequential data assimilation provides an updated reservoir model conditioned on the most recent production data. The updated ensemble is used to predict the uncertainty in future production, and it is demonstrated that the EnKF leads to improved predictions with reduced uncertainty.
This paper applies the ensemble Kalman filter (EnKF) to history match a North Sea field model. This is, as far as we know, one of the first published studies in which the EnKF is applied in a realistic setting using real production data. The reservoir-simulation model has approximately 45,000 active grid cells, and 5 years of production data are assimilated. The estimated parameters consist of the permeability and porosity fields, and the results are compared with a model previously established using a manual history-matching procedure. It was found that the EnKF estimate improved the match to the production data. This study, therefore, supported previous findings when using synthetic models that the EnKF may provide a useful tool for history matching reservoir parameters such as the permeability and porosity fields.
The EnKF developed by Evensen (1994, 2003, 2007) is a statistical method suitable for data assimilation in large-scale nonlinear models. It is a Monte Carlo method, where model uncertainty is represented by an ensemble of realizations. The prediction of the estimate and uncertainty is performed by ensemble integration using the reservoir-simulation model. The method provides error estimates at any time based on information from the ensemble. When production data are available, a variance-minimizing scheme is used to update the realizations. The EnKF provides a general and model-independent formulation and can be used to improve the estimates of both the parameters and variables in the model. The method has previously been applied in a number of applications [e.g., in dynamical ocean models (Haugen and Evensen 2002), in model systems describing ocean ecosystems (Natvik and Evensen 2003a, 2003b), and in applications within meteorology (Houtekamer et al. 2005)]. This shows that the EnKF is capable of handling different types of complex and nonlinear model systems.
The method was first introduced into the petroleum industry in studies related to well-flow modeling (Lorentzen et al. 2001, 2003). Nævdal et al. (2002) used the EnKF in a reservoir application to estimate model permeability, focusing on a near-well reservoir model. They showed that there could be a great benefit from using the EnKF to improve the model through parameter estimation, and that this could lead to improved predictions. Nævdal et al. (2005) showed promising results estimating the permeability as a continuous field variable in a 2D field-like example. Gu and Oliver (2005) examined the EnKF for combined parameter and state estimation in a standardized reservoir test case. Gao et al. (2006) compared the EnKF with the randomized-maximum-likelihood method and pointed out several similarities between the methods. Liu and Oliver (2005a, 2005b) examined the EnKF for facies estimation in a reservoir-simulation model. This is a highly nonlinear problem where the probability-density function for the petrophysical properties becomes multimodal, and it is not clear how the EnKF can best handle this. A method was proposed in which the facies distribution for each ensemble member is represented by two normally distributed Gaussian fields using a method called truncated pluri-Gaussian simulation (Lantuéjoul 2002). Wen and Chen (2006) provided another discussion of the EnKF for estimation of the permeability field in a 2D reservoir-simulation model and examined the effect of the ensemble size. Lorentzen et al. (2005) focused on the sensitivity of the results with respect to the choice of initial ensemble, using the PUNQ-S3 test case. Skjervheim et al. (2007) used the EnKF to assimilate 4D seismic data. It was shown that the EnKF can handle these large data sets and that a positive impact could be found despite the high noise level in the data.
The EnKF has some important advantages when compared to traditional assisted history-matching methods; the result is an ensemble of history-matched models that are all possible model realizations. The data are processed sequentially in time, meaning that new data are easily accounted for when they arrive. The method allows for simultaneous estimation of a huge number of poorly known parameters such as fields of properties defined in each grid cell.
By analyzing the EnKF update equations, it is seen that the actual degrees of freedom in the estimation problem are limited to the ensemble size. One is still able to update the most important features of large-scale models. A limitation of the EnKF is the fact that its computations are based on first- and second-order moments, and there are problems that are difficult to handle, particularly when the probability distributions are multimodal (e.g., when representing a bimodal channel facies distribution).
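The limit on the degrees of freedom follows because every analysis increment is a linear combination of the forecast anomalies, so the update lives in a subspace of dimension at most the ensemble size. A toy numerical check (all sizes arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
Nm, Ne = 1000, 10                       # many unknowns, few ensemble members
X = rng.normal(size=(Nm, Ne))           # forecast ensemble
Xp = X - X.mean(axis=1, keepdims=True)  # anomalies; every EnKF increment
rank = np.linalg.matrix_rank(Xp)        # lies in the column span of Xp
# rank is Ne - 1: centering removes one degree of freedom, so at most
# Ne - 1 independent directions of this 1000-dimensional model get corrected
```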
This paper considers the use of the EnKF for estimating dynamic and static parameters, focusing on permeability and porosity, in a field model of a StatoilHydro-operated field in the North Sea. The largest uncertainty in the model is expected to be related to the permeability values, especially in the upper part of the reservoir where the uncertainty may be as large as 30%.
A method based on the ensemble Kalman filter (EnKF) for continuous model updating with respect to the combination of production data and 4D seismic data is presented. When the seismic data are given as a difference between two surveys, a combination of the ensemble Kalman filter and the ensemble Kalman smoother has to be applied. Also, special care has to be taken because of the large amount of data assimilated. Still, the method is completely recursive, with little additional cost compared to the traditional EnKF. The model system consists of a commercial reservoir simulator coupled with rock-physics and seismic-modeling software. Both static variables (porosity, permeability, and rock-physics parameters) and dynamic variables (saturations and pressures) may be updated continuously with time based on the information contained in the assimilated measurements. The method is applied to a synthetic model and a real field case from the North Sea. In both cases, the 4D seismic data are different variations of inverted seismic. For the synthetic case, it is shown that the introduction of seismic data gives a much better estimate of reservoir permeability. For the field case, the introduction of seismic data gives a very different permeability field than using only production data, while retaining the production match.
The Kalman filter was originally developed to update the states of linear systems (Kalman 1960). For a presentation of this method in a probabilistic, linear least-squares setting, see Tarantola (2005). However, this method is not suitable for nonlinear models, and the ensemble Kalman filter (EnKF) method was introduced in 1994 by Geir Evensen for updating nonlinear ocean models (Evensen 1994). The method may also be applied to a combined state and parameter estimation problem (Evensen 2006; Lorentzen 2001; Anderson 1998). Several recent investigations have shown the potential of the EnKF for continuous updating of reservoir simulation models, as an alternative to traditional history matching (Nævdal et al. 2002a, b; Nævdal et al. 2005; Gu and Oliver 2004; Gao and Reynolds 2005; Wen and Chen 2005). The EnKF method is a Monte Carlo type sequential Bayesian inversion, and provides an approximate solution to the combined parameter and state-estimation problem. The result is an ensemble of solutions approximating the posterior probability density function for the model input parameters (e.g., permeability and porosity), state variables (pressures and saturations), and other output data (e.g., well production history) conditioned to measured, dynamic data.
Conditioning reservoir simulation models to seismic data is a difficult task (Gosselin et al. 2003). In this paper, we show how the ensemble Kalman filter method can be used to update a combined reservoir simulation/seismic model using the combination of production data and inverted 4D seismic data. There are special challenges involved in the assimilation of the large amount of data available with 4D seismic, and the present work is based on the work presented by Evensen (2006, 2004) and Evensen and van Leeuwen (2000). In the following, the combined state and parameter estimation problem is described in a Bayesian framework, and it is shown how this problem is solved using the EnKF method, with emphasis on the application to 4D seismic data. When the seismic data are given as a difference between two surveys, a combination of the ensemble Kalman filter and the ensemble Kalman smoother has to be applied. Special challenges involved when the amount of data is very large are discussed. The validity of the method is examined using a synthetic model, and finally, a real case from the North Sea is presented.
The ensemble Kalman filter (EnKF) has been used for history matching a simulation model of a North Sea reservoir. Parameters such as initial fluid contacts, vertical transmissibility multipliers, and fault transmissibility multipliers have been estimated, as well as 3D fields of porosity and permeability.
It is shown that for several of the parameters a large initial uncertainty is reduced to an acceptable level by the assimilation of well-log measurements and production rates of oil, gas, and water. The result is an ensemble of history-matched realizations that can be used to predict the uncertainty in future production.
It is also shown that the formulation used in the EnKF reduces a nonlinear minimization problem in a huge parameter space, involving the minimization of an objective function with multiple local minima, to a statistical minimization problem in the ensemble space. By searching for the mean rather than the mode of the posterior pdf, the method avoids getting trapped in local minima and is therefore promising for history matching reservoir simulation models.
Furthermore, the EnKF provides an ideal setting for operational reservoir monitoring and prediction, including proper representation and prediction of uncertainty.
Recently, there has been a growing interest in more mathematical and statistical methods for history matching. These involve both brute-force direct minimization techniques and gradient methods based on the use of adjoints. Common to these is that they all consider a pure parameter-estimation problem. This differs from the combined parameter and state estimation problem considered when using the Ensemble Kalman Filter (EnKF) introduced by Evensen (1994, 2006). The EnKF has recently been taken into use with simulation models for oil and gas reservoirs, with the purpose of estimating poorly known parameters and improving the predictive capability of the models.
Traditional methods for assisted history matching minimize a cost function that measures the difference between simulated and observed production rates. The methods use the following loop: (1) the flow simulator is run for the complete production period; (2) the cost function is evaluated based on the difference between historical and simulated production rates; (3) the static parameters are updated, and the simulator is rerun. These methods solve a so-called strong-constraint formulation, where the model errors are assumed to be accounted for by the set of parameters included in the cost function. The search for the solution is conducted in a space with dimension equal to the number of parameters, and the problem becomes highly nonlinear, leading to a cost function that typically contains many local minima. This effectively limits the number of parameters that can be included in the optimization. Consequently, the parameterization used becomes critical, and the major uncertainty in the model must be represented by as few parameters as possible (Evensen, 2006).
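The three-step loop can be sketched in a few lines for a small parameter vector; here a finite-difference gradient stands in for an adjoint, and all names are illustrative:

```python
import numpy as np

def history_match(simulate, d_obs, Cd_inv, p0, n_iter=100, step=0.1, fd_eps=1e-6):
    """Minimal strong-constraint history-matching loop: (1) simulate,
    (2) evaluate the data-mismatch cost, (3) update the parameters and
    rerun. Gradient descent with a forward-difference gradient is a
    stand-in for an adjoint; a real workflow runs a flow simulator."""
    p = np.asarray(p0, dtype=float).copy()

    def cost(q):  # weighted least-squares data mismatch
        r = simulate(q) - d_obs
        return 0.5 * r @ Cd_inv @ r

    for _ in range(n_iter):
        g = np.array([(cost(p + fd_eps * e) - cost(p)) / fd_eps
                      for e in np.eye(len(p))])
        p -= step * g
    return p, cost(p)
```

The cost of the finite-difference gradient grows with the number of parameters, which mirrors the abstract's point that such methods only remain practical for small parameterizations.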