The need to develop new tools that allow reservoir engineers to optimize reservoir performance is becoming more demanding by the day. One of the most challenging and influential problems facing reservoir engineers is well placement optimization.
The North Kuwait field (NKF) consists of six fields containing four naturally fractured carbonate formations. The reservoirs are composed of relatively tight limestone and dolomite embedded with anhydrite and shale. The fields are divided into isolated compartments by fault zones, a division supported by differences in fluid composition, initial pressure, and estimated free-water level. Because of the natural complexity, tightness, and high drilling costs of wells in the NKF, identifying the sweet spots and the optimum well locations is critical.
This paper presents two intelligent methods that use dynamic numerical simulation model results and static reservoir properties to identify zones with high production potential: the reservoir opportunity index (ROI) and the simulation opportunity index (SOI). The Petrel* E&P software platform was chosen as the integrated platform to implement the workflow. The fit-for-purpose, time-dependent 2D maps generated by the Petrel platform facilitated the decision-making process used for locating new wells in the dominant flow system and provided strong support for field-development plans.
The difference between the two methods is insignificant because of reservoir tightness, limited interference, and inherent uncertainty in compartmentalization. At this stage, pressure is not a key parameter. As a result, unlike in brown fields, less weight was given to simulated pressure, and SOI was used to select the well locations.
The results of this study show that implementing these workflows and the resulting maps significantly improves the selection of the most productive areas and layers in a field. The optimum number of wells obtained with this method, in less time and with fewer resources, is also compared with results from traditional industry approaches.
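The abstract does not give the ROI/SOI formulation. A common construction for such opportunity maps multiplies min-max-normalized property grids cell by cell, so that only cells scoring well on every property rank highly; the sketch below illustrates that generic recipe with invented porosity, permeability, and oil-saturation maps (none of these values are from the paper):

```python
import numpy as np

def opportunity_index(props):
    """Toy opportunity index: min-max normalize each property map,
    then multiply them cell by cell so a cell must score well on
    every property to rank highly."""
    norm = []
    for p in props:
        p = np.asarray(p, dtype=float)
        lo, hi = p.min(), p.max()
        norm.append((p - lo) / (hi - lo) if hi > lo else np.ones_like(p))
    roi = np.ones_like(norm[0])
    for n in norm:
        roi *= n
    return roi

# 2x2 toy maps: porosity, permeability (md), oil saturation
porosity = [[0.10, 0.30], [0.20, 0.25]]
perm     = [[5.0, 500.0], [50.0, 200.0]]
so       = [[0.40, 0.80], [0.60, 0.70]]

roi = opportunity_index([porosity, perm, so])
best = np.unravel_index(np.argmax(roi), roi.shape)
print(best)  # → (0, 1): the cell that scores highest on all three maps
```

An SOI-style variant would substitute simulated dynamic properties (for example, remaining oil or mobility from the numerical model) for the static maps.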
The global economy continues to evolve, with industrialization as its primary driving force. Given the fast pace of development, and recovery from several recessions over the years, dependency on energy sources has become inevitable to satisfy rising demand. This paper presents a proposed global energy price model, the Alternate Novel Model (ANM), which offers the flexibility to model energy prices using data from specific regions of the world as well as a global energy pricing equation.
The model focuses mainly on oil price modeling, since oil accounts for more than 84% of the current world energy supply. The model spans 50 years, from 1980 to 2030, with a history-matching period from 1980 to 2011 and a prediction period from 2012 to 2030.
The modeling approach used in ANM adopts weighted averaging of individual factors and relies on linear regression. Future trends are therefore predicted from the cyclic nature of the market and from historical data, on the premise that "the future is a reflection of the past." ANM can then predict future oil prices, depending on the factors and variables supplied to the process.
The paper aims to propose a reliable model that accounts for most of the governing factors in the global energy pricing equation. All steps followed and assumptions made are discussed in detail to clarify the working mechanism of the model and pave the road for future modifications.
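The abstract describes ANM only as weighted averaging of factors combined with linear regression. The sketch below illustrates that generic recipe on synthetic data; the factor histories, weights, and prices are invented for illustration, not the paper's data or coefficients:

```python
import numpy as np

# Hypothetical yearly factor histories (e.g., demand and supply indices).
years   = np.arange(2000, 2010, dtype=float)
factors = np.column_stack([years - 2000, (years - 2000) ** 0.5])
weights = np.array([0.7, 0.3])  # assumed relative factor weights

# Weighted-average composite driver, then an ordinary least-squares
# line fit, price ≈ a * driver + b, as in a linear-regression model.
driver = factors @ weights
price  = 20.0 + 5.0 * driver   # synthetic "historical" prices
a, b   = np.polyfit(driver, price, 1)

# Extrapolate the fitted trend to a future driver value.
future_driver = 12.0
forecast = a * future_driver + b
print(round(forecast, 1))  # → 80.0 on this exactly linear toy data
```

In a real application the history-matching step would calibrate the weights and regression coefficients against the 1980-2011 period before extrapolating.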
Haider, Bader Y.A. (Kuwait Oil Company) | Rachapudi, Rama Rao Venkata Subba (Kuwait Oil Company) | Al-Yahya, Mohammad (Kuwait Oil Company) | Al-Mutairi, Talal (Kuwait Oil Company) | Al Deyain, Khaled Waleed (Kuwait Oil Company)
Production from an artificially lifted well depends on the performance of the electrical submersible pump (ESP) and on reservoir inflow. Real-time monitoring of ESP performance and reservoir productivity is essential for production optimization, and this in turn helps improve ESP run life. A real-time workflow was developed to track ESP performance and well productivity using real-time ESP sensor data. The workflow was automated using a real-time data server, and the results were made available through a desktop application.
Real-time ESP performance information was used in regular well reviews to identify problems with ESP performance and to investigate opportunities for increasing production. ESP real-time data combined with well-model analysis was further used to address well problems.
This paper describes the workflow design, its automation, and real field-case implementation of the optimization decisions. Ultimately, the workflow helped extend ESP run life and created a well-performance monitoring system that eliminated manual maintenance of the data. In the future, this workflow will be part of a full-field digital oilfield implementation.
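One common way to track ESP performance from real-time sensor data, consistent with the monitoring described above, is to compare the head implied by the intake and discharge pressure sensors against the pump's catalog head-rate curve. A minimal sketch; the curve coefficients, rates, and degradation threshold below are invented, not from the paper:

```python
def expected_head(rate_bpd, coeffs=(2800.0, -0.004, -2.0e-7)):
    """Catalog head-rate curve H(q) = a + b*q + c*q^2 for a
    hypothetical ESP stage stack (illustrative coefficients)."""
    a, b, c = coeffs
    return a + b * rate_bpd + c * rate_bpd ** 2

def esp_health(rate_bpd, measured_head_ft, threshold=0.85):
    """Real-time check: ratio of measured head to catalog head;
    flag possible wear or gas interference when the ratio drops
    below an assumed threshold."""
    ratio = measured_head_ft / expected_head(rate_bpd)
    return ratio, ("OK" if ratio >= threshold else "DEGRADED")

print(esp_health(3000, 2650)[1])  # → OK: operating near the curve
print(esp_health(3000, 2000)[1])  # → DEGRADED: low head ratio, flag for review
```

An automated workflow would evaluate this check on each sensor update and surface the flagged wells in the review application.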
Determining the optimum location of wells during waterflooding contributes significantly to efficient reservoir management. Often, Voidage Replacement Ratio (VRR) and Net Present Value (NPV) are used as indicators of performance of waterflood projects. In addition, VRR is used by regulatory and environmental agencies as a means of monitoring the impact of field development activities on the environment while NPV is used by investors as a measure of profitability of oil and gas projects. Over the years, well placement optimization has been done mainly to increase the NPV. However, regulatory measures call for operators to maintain a VRR of one (or close to one) during waterflooding.
A multiobjective approach incorporating NPV and VRR is proposed for solving the well placement optimization problem. We present the use of both NPV and VRR as objective functions in the determination of optimal well locations. Combining the two in a multiobjective optimization framework proves useful in identifying the trade-offs between the quest for high profitability of investment in oil and gas projects and the desire to satisfy regulatory and environmental requirements. We conducted the search for optimum well locations in three phases. In the first phase, only the NPV was used as the objective function. The second phase had the VRR as the sole objective function. In the third phase, the objective function was a weighted sum of the NPV and the VRR. A set of four weights was used in the third phase to describe the relative importance of the NPV and the VRR, and a comparison of how these weights affect the optimized NPV and VRR values is provided.
We applied the method to determine the optimum placement of wells using two sample reservoirs: one with a distributed permeability field and the other a channel reservoir with four facies. Two evolutionary-type algorithms, the covariance matrix adaptation evolution strategy (CMA-ES) and differential evolution (DE), were used to solve the optimization problem. Significantly, the method illustrates the trade-off between maximizing the NPV and optimizing the VRR. It calls the attention of both investors and regulatory agencies to the need to consider the financial aspect (NPV) and the environmental aspect (VRR) of waterflooding during secondary oil recovery projects. The multiobjective optimization approach meets the economic needs of investors and the regulatory requirements of government and environmental agencies. This approach gives a realistic NPV estimation for companies operating in jurisdictions with a requirement to maintain a VRR of one.
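The third-phase objective, a weighted sum of NPV and VRR, can be sketched as below. The penalty form (squared deviation of VRR from one), the NPV normalization scale, and the candidate numbers are illustrative assumptions, not the paper's formulation or values:

```python
def weighted_objective(npv, vrr, w, npv_scale=1e8):
    """Toy weighted-sum objective for well-placement search: reward
    normalized NPV and penalize squared deviation of VRR from the
    regulatory target of one. Penalty form and scale are assumed."""
    return w * (npv / npv_scale) - (1.0 - w) * (vrr - 1.0) ** 2

# Two hypothetical candidate placements: A is more profitable but
# over-injects (VRR 1.6); B earns less but keeps VRR at one.
for w in (0.5, 0.9):
    cand_a = weighted_objective(npv=9.0e7, vrr=1.6, w=w)
    cand_b = weighted_objective(npv=7.0e7, vrr=1.0, w=w)
    print(w, "A" if cand_a > cand_b else "B")
# → 0.5 B   (at equal weights the VRR penalty dominates)
# → 0.9 A   (a profit-heavy weighting accepts the VRR violation)
```

Sweeping the weight, as the abstract's four-weight comparison does, traces out exactly this kind of trade-off between profitability and voidage compliance.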
Wu, JinYong (Schlumberger) | Banerjee, Raj (Schlumberger) | Bolanos, Nelson (Schlumberger) | Alvi, Amanullah (Schlumberger) | Tilke, Peter Gerhard (Schlumberger - Doll Research) | Jilani, Syed Zeeshan (Schlumberger Oilfield UK Plc) | Bogush, Alexander (Schlumberger)
Assessing the waterflood, monitoring the fluid front, and enhancing sweep under the uncertainty of multiple geological realisations, data quality, and measurements present an ongoing challenge. Defining sweet spots and optimal candidate well locations in a well-developed large field presents an additional challenge for reservoir management. A case study is presented that highlights the approach to this cycle of time-lapse monitoring, acquisition, analysis, and planning in the delivery of an optimal field-development strategy using multi-constrained optimisation combined with fast semi-analytical and numerical simulators.
The multi-constrained optimiser is used in conjunction with different semi-analytical and simulation tools (streamlines, traditional simulators, and new high-powered simulation tools able to manage huge, multi-million-cell-field models) and rapidly predicts optimal well placement locations with inclusion of anti-collision in the presence of the reservoir uncertainties. The case study evaluates proposed field development strategies using the automated multivariable optimisation of well locations, trajectories, completion locations, and flow rates in the presence of existing wells and production history, geological parameters and reservoir engineering constraints, subsurface uncertainty, capex and opex costs, risk tolerance, and drilling sequence.
This optimisation is fast and allows for quick evaluation of multiple strategies to decipher an optimal development plan. Optimisers are a key technology facilitating simulation workflows, since there is no ‘one-approach-fits-all' when optimising oilfield development. Driven by different objective functions (net present value (NPV), return on investment (ROI), or production totals) the case study highlights the challenges, the best practices, and the advantages of an integrated approach in developing an optimal development plan for a brownfield.
This paper presents a novel implementation of evolutionary algorithms for history-matching problems in oil and gas reservoirs. The reservoir history is divided into time segments. In each time segment, a penalty function is constructed that quantifies the mismatch between the measurements and the simulated measurements, using only the measurements available up to the current time segment. An evolutionary optimization algorithm is used, in each time segment, to search for the optimal reservoir permeability and porosity parameters. The penalty function varies between segments; yet the optimal reservoir characterization is common to all the constructed penalty functions. A population of reservoir characterizations evolves through subsequent time segments by minimizing the different penalty functions. The advantage of this implementation is twofold. First, the computational cost of the history-matching process is significantly reduced. Second, problem constraints can be included in the penalty function to produce more realistic solutions. The proposed concept of a dynamic penalty function is applicable to any evolutionary algorithm. In this paper, the implementation is carried out using genetic algorithms. Two case studies are presented: a synthetic case study and the PUNQ-S3 field case study. A computational cost analysis that demonstrates the computational advantage of the proposed method is presented.
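A minimal sketch of the dynamic-penalty idea: the same population evolves through successive time segments, each segment minimizing a penalty built only from the measurements available so far. A toy response function stands in for the reservoir simulator, and the two parameters stand in for porosity and permeability; all values are illustrative, and a simple (mu + lambda) scheme replaces the paper's genetic algorithm:

```python
import random

random.seed(0)
TRUE = (0.25, 150.0)  # hypothetical "true" porosity, permeability

def simulate(params, t):
    """Stand-in for the reservoir simulator: a smooth, cheap
    response of the two parameters at time step t (not physics)."""
    phi, k = params
    return 100.0 * phi + 0.5 * k * (1.0 + 0.01 * t)

DATA = [simulate(TRUE, t) for t in range(10)]  # synthetic "measurements"

def penalty(params, upto):
    # Dynamic penalty: mismatch using only the measurements
    # available up to the current time segment.
    return sum((simulate(params, t) - DATA[t]) ** 2 for t in range(upto))

def evolve(pop, upto, gens=30):
    """One elitist (mu + lambda) pass against a segment's penalty."""
    for _ in range(gens):
        kids = [(max(0.01, phi + random.gauss(0, 0.02)),
                 max(1.0, k + random.gauss(0, 5.0))) for phi, k in pop]
        pop = sorted(pop + kids, key=lambda p: penalty(p, upto))[:len(pop)]
    return pop

pop = [(random.uniform(0.05, 0.4), random.uniform(50.0, 250.0))
       for _ in range(20)]
for segment in (3, 6, 10):        # population carried across segments
    pop = evolve(pop, segment)

best = pop[0]
print("best match:", round(best[0], 3), round(best[1], 1))
```

Because each segment's penalty uses only a prefix of the data, early segments are cheap to evaluate, which is the source of the computational saving the abstract claims.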
The North Kuwait Jurassic Gas (NKJG) reservoirs are currently under development by KOC with assistance from Shell under an Enhanced Technical Services Agreement (ETSA). The fractured carbonate reservoirs contain gas condensate and volatile oil at pressures up to 11,500 psi with 2.5% H2S and 1.5% CO2. This paper describes the planning and implementation of a Well Integrity Management System (WIMS) that allows the safe management of the wells that are being drilled in this hazardous environment.
The wells are designed and constructed in accordance with KOC standards and, on transfer of ownership from the Deep Drilling Group to the Production Services Group, have their integrity managed under WIMS. The system is a structured process, relating the frequency and extent of routine monitoring and testing to the particular risks associated with the wells. Compliance with WIMS requirements is routinely reported so that all are aware of the current state of well integrity. WIMS is initially managed through simple spreadsheets and during 2012 is being integrated into KOC's Digital Field infrastructure.
Initially, WIMS has been applied to the range of wells ‘owned' by Production Services Group and tests currently carried out by Well Surveillance Group under PSG's direction. In order to realise the full assurance of safe operation the scope of WIMS application is being extended to the full well population, including suspended wells, and the full range of tests required.
Implementation of WIMS will allow KOC (NKJG) to be able to state that ‘our wells are safe and we know it'.
Ali, Zaki (Schlumberger) | G. Bonilla, Juan Carlos (Schlumberger) | Zolotavin, Andrey (Kuwait Oil Company) | Al-Shammari, Reem Faraj (Kuwait Oil Company) | Robert, Herric (Schlumberger) | Saleem, Hussain A. (Kuwait Oil Company) | Farid, Ahmad (Schlumberger)
As oilfields mature and new fields come into operation, real-time asset management of reserves is providing ongoing challenges to Kuwait Oil Company (KOC). Fewer engineers are managing more wells under increasingly tougher environmental conditions and compliance regulations. The combination of these factors has driven the need for KOC to make a step change in its approach to operations by incorporating digital field concepts to transform the way engineers are working. The result is the Kuwait Intelligent Digital Field (KwIDF) initiative.
To enable KwIDF, new technologies were deployed in both mature and immature assets, creating issues in terms of interoperability and integration thereby increasing the strain on the legacy IT infrastructure. In addition, there was the requirement to isolate the SCADA industrial networks from the corporate business networks while automating traffic control with the various enterprise data systems. This ‘managed' separation complicated the delivery of productivity tools to employees and posed the greatest challenge to creating a transparent, seamless KwIDF infrastructure.
The KwIDF Jurassic project was particularly challenging since it had the most limited existing infrastructure, requiring the design and deployment of an entirely new architecture scattered over significant distances and business areas. This in turn created significant hurdles in terms of integration and compatibility with the remainder of KOC's proprietary systems and technologies. Specific efforts were required to allow KOC's network infrastructure to be capable of embracing such solutions and technologies with proper security measures in place.
Developing a network infrastructure to enable real time solutions for KwIDF Jurassic involved analyzing the specific business drivers of the asset to ensure that the capital investment not only delivered results, but did so within a secure environment. This paper presents the methodology employed by KOC's Corporate IT Group (CITG) to deliver the right network infrastructure, along with lessons learned, for enabling the Kuwait Intelligent Digital Field Jurassic project.
The purpose of history matching is to achieve geological realizations calibrated to the historical performance of the reservoir. For complex geological structures, it is usually intractable to run tens of thousands of full reservoir simulations to trace the most probable geological model. Hence, the inadequacy of history-matching results frequently leads to poor estimation of the true model and high uncertainty in production forecasting. Reduced-order modeling procedures, which have been applied in many areas including reservoir simulation, represent a promising means of constructing efficient surrogate models. Nonlinear dimensionality-reduction techniques encapsulate the high-resolution, complex geological description of a reservoir in a low-dimensional subspace, which significantly reduces the number of unknowns and provides an efficient way to construct a proxy model based on the reduced-dimension parameters.
Polynomial chaos expansion (PCE) is a powerful tool for quantifying uncertainty in a dynamical system when there is probabilistic uncertainty in the system parameters. In reservoir simulation, it has been shown to be more accurate and efficient than traditional experimental design (ED). PCEs have a significant advantage over other response surfaces because convergence to the true probability distribution is proved as the order of the PCE is increased. Accordingly, a PCE proxy can be used as a pseudo-simulator to represent the response surfaces of the uncertain variables. When the objective and constraints of a reservoir model are described by multivariate polynomial functions, very efficient algorithms exist to compute global solutions. We have developed a workflow that incorporates PCE to find the global minimum of the misfit surface and to assess the associated uncertainty. The accuracy of the PCE proxy increases with additional trial runs of the reservoir simulator.
We conduct a two-dimensional synthetic case study of a fluvial channel, as well as a real field example, to demonstrate the effectiveness of this approach. Kernel principal component analysis (KPCA) is used to parameterize the complex geological structure. The study has revealed useful reservoir information and delivered more reliable production forecasts.
PCE-based history matching enhances the quality and efficiency of estimating the most probable geological model and improves the confidence interval of production forecasts.
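A minimal one-dimensional sketch of the PCE-proxy idea: fit a low-order expansion in probabilists' Hermite polynomials to a few sampled misfit values, then minimize the cheap proxy instead of the simulator. The misfit function, sample points, and reduced parameter here are invented stand-ins (in the paper the parameter would come from the KPCA reduction):

```python
import numpy as np

# Toy misfit surface over one reduced parameter xi ~ N(0, 1),
# standing in for the simulator's data mismatch.
def misfit(xi):
    return (xi - 0.8) ** 2 + 0.1

# A handful of simulator "runs" at chosen sample points.
xi_s = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y_s  = misfit(xi_s)

# Degree-2 PCE in probabilists' Hermite polynomials
# He0 = 1, He1 = x, He2 = x^2 - 1; least-squares coefficient fit.
basis = np.column_stack([np.ones_like(xi_s), xi_s, xi_s**2 - 1.0])
coef, *_ = np.linalg.lstsq(basis, y_s, rcond=None)

# The proxy is cheap, so a dense grid search locates its minimum.
grid   = np.linspace(-3, 3, 601)
proxy  = coef[0] + coef[1] * grid + coef[2] * (grid**2 - 1.0)
xi_hat = grid[np.argmin(proxy)]
print(round(xi_hat, 2))  # → 0.8: the proxy recovers the misfit minimum
```

Here the toy misfit is itself quadratic, so the degree-2 expansion is exact; for a real misfit surface, accuracy improves with additional simulator runs and higher expansion order, as the abstract notes.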
Significant advances have been made in formation testing since the introduction of wireline pumpout testers (WLPT), particularly with respect to downhole fluid compositional measurements. Optical sensors and the use of spectroscopic methods have been developed to improve sample quality and minimize sampling time in downhole environments. As a laboratory technique, spectroscopy is a ubiquitous and powerful technology that has been used worldwide for decades to measure the physical and chemical properties of many materials, including petroleum, geological, and hydrological samples. However, laboratory-grade, high-resolution spectrometers are incompatible with the hostile environments encountered downhole, at wellheads, and on pipelines. Only limited resolution techniques are available for the rugged conditions of the oil field. This paper introduces a new optical technology that can provide high-resolution, laboratory-quality analyses in harsh oilfield environments.
A new technology for optical sensing, multivariate optical computing (MOC), has been developed as a non-spectroscopic technique. This sensing method uses an integrated computational element (ICE) to combine the power and accuracy of high-resolution, laboratory-quality spectrometers with the ruggedness and simplicity of photometers. Many modern sensors merge the sensor with the electronics on an integrated computing chip to perform complex computations, resulting in an elegant yet simple design. Optical sensing using ICE instead features an analogue optical computation device that performs a direct, simple, and powerful mathematical computation on the optical information, entirely within the optical domain. Because the entire optical range of interest is used without dispersing the light spectrum, the measurements are obtained instantly and rival laboratory-quality results.
A proof-of-concept MOC sensor with ICE has been demonstrated, logging more than 7,000 hours of nearly continuous use over 14 months. Oils with gravities ranging from 14 to 65°API have been measured in downhole environments ranging from 3,000 to 20,000 psi and from 150 to 350°F. Hydrocarbon composition measurements, including saturates, aromatics, resins, asphaltenes, methane, and ethane, have been demonstrated using the MOC configuration. From these compositional measurements, gas/oil ratio (GOR) and density are validated to within 14 scf/bbl and 1%, respectively. The paper discusses the details of the new ICE-based sensor and describes its adaptation to downhole applications.