Abstract

The installation of intelligent wells to improve the economics of production is now common practice. These wells allow access to marginal reservoirs, for which dedicated production might not be economic, and also accelerate recovery. Sensors, flow-control devices, and other equipment can be used to manage the production from the commingled reservoirs and optimize recovery. Traditional methods for production optimization and back-allocation of complex well configurations, such as nodal analysis, work only for a static problem. They cannot account for the dynamic changes that occur over time in the connected system of reservoirs and wellbore. Once multiphase flow occurs, neither the change of the fluid mobility in the reservoir nor the change of the choke performance can be correctly addressed. Moreover, the large number of uncertainties, from reservoir to wellbore behavior, that influence the performance of these advanced wells cannot be accurately dealt with using traditional approaches. A process is introduced that creates an accurate well model of an intelligent completion, accounting for all effects that influence the pressure behavior in the wellbore and in the reservoir. This model is used for optimization over all static and dynamic uncertainties to derive an interaction strategy for the intelligent well that maximizes oil production. Furthermore, the back-allocation algorithm is calibrated and trained on a proxy of the well model.

Introduction

Intelligent wells are wells equipped with monitoring devices that record the production behavior in the completion, in the wellbore, and at the wellhead, and allow conclusions to be drawn about the inflow behavior from the reservoir. These monitoring devices are located directly at the completion, at the wellhead, along the tubing string, or in a combination of these locations. They can measure pressure, temperature, and flow rate.
Although technically possible, flow-rate measurement is rarely performed at the completion level and is instead situated at the wellhead. The monitoring devices deliver the measured data in real time, allowing on-line analysis by the engineer. However, these sensors alone do not constitute an intelligent well. Control devices, which allow immediate (rig-less) interaction with the completions, production tubing, or wellhead, are needed to react to the recorded events. In the simplest case, such a device is an on-off tool that closes a completion partially or entirely. A more sophisticated option is a surface-controlled flow-control valve that can restrict the inflow from the completion into the production tubing by choking the fluid flow anywhere between fully open and completely closed. The artificial-lift system is also part of the control devices: for example, the production rate can be reduced to avoid coning and to establish production below the critical rate the moment the first signs of breakthrough appear. Commingled production from two or more productive horizons is the ideal method to accelerate production from a single well. Furthermore, marginal reservoirs, which would be uneconomic with dedicated production, can become viable. The application of intelligent completions in such commingled wells allows not only production and recovery optimization for each individual reservoir but also value maximization of the well. Intelligent completions can also satisfy regulatory requirements to back-allocate production from the wellhead measurement to the individual reservoirs for reserves booking.
An operator would not accept a predetermined strategy for optimizing the production of a costly intelligent well that is based on an uncertain reservoir description.[1] Rather, the operator will proactively take advantage of the sensors installed in the well by learning from the production behavior, relating it to the reservoir uncertainties, and interacting with the well immediately for optimized production. One of the parameters that must be known is the contribution of each completion to the well production, since any optimization technique bases its calculation on it while trying to improve the objective function, for example maximizing oil production or oil recovery. This paper suggests a workflow that handles the complex task of stochastic production optimization of intelligent completions over all parameters of influence and establishes the stochastic back-allocation algorithm. Both back-allocation and production optimization can be carried out in real time.
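The back-allocation idea above can be illustrated with a minimal sketch. The paper's algorithm is stochastic and trained on a proxy of the well model; the version below is only a simplified proportional scheme, and all names (`back_allocate`, `zone_A`, `zone_B`) and rates are hypothetical.

```python
def back_allocate(wellhead_rate, predicted_contributions):
    """Split a measured wellhead rate across completions in proportion to
    model-predicted (e.g. proxy-predicted) zone contributions."""
    total = sum(predicted_contributions.values())
    return {zone: wellhead_rate * q / total
            for zone, q in predicted_contributions.items()}

# Illustrative: 1200 units measured at the wellhead, zone_A predicted to
# contribute twice as much as zone_B.
alloc = back_allocate(1200.0, {"zone_A": 2.0, "zone_B": 1.0})
```

By construction the allocated rates honor the wellhead measurement (they sum to `wellhead_rate`), which is the constraint any back-allocation scheme must satisfy for reserves booking.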
Abstract

Proxy models are becoming more widely used because they can simplify highly complex processes with reasonable accuracy. Especially in risk analysis, where complex relationships between the uncertainty parameters exist, proxy models are used in the form of response surfaces to accelerate interpretation and optimization methods. However, proxy models are rarely seen in production optimization. When the data gathering from wells and surface equipment is fully automated, production optimization can be performed almost in real time. The bottleneck in this workflow is the high computational effort of simulation models and the large number of input variables to optimize. This disadvantage can be overcome by mimicking the behavior of the system, such as the coupling of a simulation model and the surface-network model, with a computationally efficient method. The requirements for such proxy models are high, since they have to capture highly non-linear trends hidden in a small number of representative samples. This paper presents the use of neural networks as proxy models. For the production-optimization process, genetic algorithms are used. Their advantage lies in the ability to handle a large number of input variables. The neural network operates as the fitness function for the genetic algorithm. The optimization result can be achieved extremely fast (within seconds), allowing optimization in near real time. A real-life example is also presented in this work.

Introduction

Simulation models are usually used to assess the value and the opportunities in a field. Those models are derived from complex studies involving a high degree of uncertainty and a large number of parameters that may be independent of or dependent on each other. The relation between the computed simulation results and the input data is generally highly nonlinear.
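The abstract's coupling of a genetic algorithm with a proxy fitness function can be sketched as follows. This is a minimal illustration, not the paper's implementation: `proxy_oil_rate` is an assumed stand-in for a trained neural-network proxy, and the GA settings (population size, truncation selection, one-point crossover, Gaussian mutation) are illustrative choices.

```python
import random

def proxy_oil_rate(settings):
    """Stand-in for a trained neural-network proxy that maps control
    settings (e.g. valve positions in [0, 1]) to a predicted oil rate.
    This toy response peaks when every setting is 0.5."""
    return sum(4.0 * s * (1.0 - s) for s in settings) / len(settings)

def genetic_optimize(fitness, n_vars, pop_size=40, generations=60,
                     mutation_rate=0.2, sigma=0.1, seed=7):
    """Simple elitist GA maximizing `fitness`; requires n_vars >= 2."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # elitist truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_vars)       # one-point crossover
            child = a[:cut] + b[cut:]
            child = [min(1.0, max(0.0, g + rng.gauss(0.0, sigma)))
                     if rng.random() < mutation_rate else g
                     for g in child]             # clipped Gaussian mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = genetic_optimize(proxy_oil_rate, n_vars=5)
```

Because the proxy evaluates in microseconds rather than the hours a simulator needs, thousands of fitness evaluations per optimization become affordable, which is what enables the near-real-time optimization described above.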
The optimization of such a Petroleum Production System (PPS) is therefore an extremely complex task with many degrees of freedom, requiring many different numerical simulation models or realizations. Furthermore, the calculations from the reservoir to the surface installations involve a series of variable parameters in addition to the reservoir parameters. Especially in large fields, deriving all possible solutions becomes either very time-consuming or practically impossible. Moreover, once the uncertainty of the input parameters is considered, it is completely impractical to address every possible combination of all influencing factors. Proxy models, which are a simplified representation of the response surface of the numerical models, have been introduced to overcome this problem. This paper describes how to create a proxy model with the help of Neural Networks. Using a limited number of simulation runs, Neural Networks are able to mimic the strongly non-linear behavior of a complex PPS, constructing a proxy model that can be used for further investigations such as production optimization.

Experimental Design

It is imperative for sensitivity studies to consider as many influencing uncertainty parameters as possible while at the same time minimizing the number of simulation runs. A series of strategies (Design of Experiments) has been proposed that aim at maximizing the amount of information gained from a minimum number of runs. Reference 1 proposes a step-like strategy for such a sensitivity study (Figure 1): the first step involves screening the sensitivities of the parameters using a two-level full factorial analysis. All possible combinations of the minimum and maximum values of all parameters are simulated, and the most influential parameters are identified. The full factorial analysis leads to 2^n simulation runs, where n is the number of uncertainty parameters investigated.
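Enumerating the 2^n cases of a two-level full factorial design is straightforward; a sketch follows. The parameter names and ranges are hypothetical, not taken from the paper.

```python
from itertools import product

# Illustrative uncertainty parameters with (min, max) levels.
levels = {
    "permeability_mD": (10.0, 500.0),
    "porosity": (0.12, 0.28),
    "aquifer_strength": (0.1, 1.0),
}

# Two-level full factorial design: every min/max combination,
# i.e. 2**n runs for n parameters (here 2**3 = 8).
runs = [dict(zip(levels, combo)) for combo in product(*levels.values())]
```

Each entry of `runs` is one complete set of simulator inputs; the exponential growth in n is exactly why the screening step keeps only the most influential parameters before building the response surface.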
However, to account for the frequently occurring nonlinearities, Reference 2 suggests using a three-level full factorial analysis (minimum, best estimate, maximum), which extends the number of simulation runs to 3^n. From the output of this analysis, a tornado chart is generated, from which the influence of each parameter on the overall result can be derived. The most influential parameters, typically about three,[1] are used to continue the analysis in more detail. The simulation results are then regarded as the space of possible outcomes on which the response surface is constructed.
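The tornado-chart ranking can be sketched as below. Note this is a simplified one-parameter-at-a-time variant (each parameter swept from minimum to maximum with the others held at their best estimate), not the full 3^n analysis; the `response` function and all level values are illustrative stand-ins for the simulator.

```python
def tornado_effects(response, levels):
    """Rank parameters by the response swing when each is moved from its
    minimum to its maximum with all others held at the best estimate.
    `levels` maps parameter name -> (min, best estimate, max)."""
    base = {p: lv[1] for p, lv in levels.items()}   # best-estimate case
    swings = {}
    for p, (lo, _mid, hi) in levels.items():
        low = response({**base, p: lo})
        high = response({**base, p: hi})
        swings[p] = abs(high - low)
    # Largest swing first: the order the bars appear in a tornado chart.
    return sorted(swings.items(), key=lambda kv: kv[1], reverse=True)

# Illustrative linear response standing in for the simulator output.
def response(x):
    return 2.0 * x["perm"] + 0.5 * x["poro"] - 0.1 * x["skin"]

levels = {"perm": (1, 5, 9), "poro": (2, 4, 6), "skin": (0, 3, 10)}
ranking = tornado_effects(response, levels)
```

The first entries of `ranking` are the "most influential parameters" the text carries forward into the detailed analysis.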