Kawaguchi, Kyojiro (Yokogawa Electric Corporation) | Takekawa, Masahiro (Yokogawa Electric Corporation) | Ohtani, Tetsuya (Yokogawa Electric Corporation) | Wada, Hidehiko (Yokogawa Electric Corporation)
This paper discusses uncertainty and optimal sensor placement for estimating multi-phase flow rates from a producing well by inversely solving a well flow model, given downhole temperature and pressure measurements. The uncertainty of the estimated multi-phase flow rates is evaluated by propagating the given uncertainties of the pressure and temperature measurements used to make the estimation. These estimation uncertainties are studied for different sensor combinations and locations to search for the optimal sensor placement, i.e., the one that gives the lowest uncertainty in the multi-phase flow rate estimates. From a simulation study, preferred sensor combinations and locations of pressure and temperature measurements are found for a typical oil-gas two-phase vertical well case. The methodology can be applied to reliable downhole multi-phase flow monitoring and to optimal sensor placement design for monitoring multi-phase flow rates using temperature and pressure measurements within an allowable estimation uncertainty and sensor cost.
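The error-propagation step described above can be sketched as follows: for a forward well-flow model m = f(q) mapping flow rates to sensor readings, the linearized covariance of the inverted rates is Sigma_q = (J^T Sigma_m^-1 J)^-1, where J is the model Jacobian. The toy forward model, its coefficients, and the function names below are illustrative assumptions, not the authors' actual well flow model:

```python
import numpy as np

# Hypothetical toy forward model: maps oil/gas flow rates (q_o, q_g) to
# readings at two pressure sensors and one temperature sensor.  The real
# well flow model in the paper is far more detailed; this surrogate only
# illustrates the propagation mechanics.
def forward_model(q):
    q_o, q_g = q
    return np.array([
        0.8 * q_o + 0.3 * q_g,         # pressure sensor 1
        0.5 * q_o + 0.6 * q_g,         # pressure sensor 2
        0.2 * q_o + 0.1 * q_g ** 1.1,  # temperature sensor
    ])

def jacobian(f, q, h=1e-6):
    """Finite-difference Jacobian of the forward model at q."""
    base = f(q)
    J = np.zeros((base.size, q.size))
    for j in range(q.size):
        dq = q.copy()
        dq[j] += h
        J[:, j] = (f(dq) - base) / h
    return J

def flow_rate_covariance(f, q, sensor_sigmas):
    """Propagate sensor uncertainties to the inverted flow rates:
    Sigma_q = (J^T Sigma_m^-1 J)^-1 (linearized least squares)."""
    J = jacobian(f, np.asarray(q, dtype=float))
    Sigma_m_inv = np.diag(1.0 / np.asarray(sensor_sigmas) ** 2)
    return np.linalg.inv(J.T @ Sigma_m_inv @ J)

# Evaluate a candidate sensor placement by its estimation uncertainty;
# repeating this for different sensor sets/locations and picking the
# lowest-variance combination is the placement search described above.
q_true = np.array([100.0, 50.0])
cov = flow_rate_covariance(forward_model, q_true, [0.5, 0.5, 0.2])
print(np.sqrt(np.diag(cov)))  # 1-sigma uncertainty of (q_o, q_g)
```

Comparing `np.sqrt(np.diag(cov))` across candidate sensor combinations gives the ranking used to select the optimal placement.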
Copyright 2013, Society of Petroleum Engineers This paper was prepared for presentation at the 2013 SPE Digital Energy Conference and Exhibition held in The Woodlands, Texas, USA, 5-7 March 2013. This paper was selected for presentation by an SPE program committee following review of information contained in an abstract submitted by the author(s). Contents of the paper have not been reviewed by the Society of Petroleum Engineers and are subject to correction by the author(s). The material does not necessarily reflect any position of the Society of Petroleum Engineers, its officers, or members. Electronic reproduction, distribution, or storage of any part of this paper without the written consent of the Society of Petroleum Engineers is prohibited. Permission to reproduce in print is restricted to an abstract of not more than 300 words; illustrations may not be copied. The abstract must contain conspicuous acknowledgment of SPE copyright. Abstract Distributed Acoustic Sensing (DAS) is a newly developed passive fiber optic sensing technology. It has the ability to detect acoustic signals anywhere along the length of the fiber with a high frequency response and a tight spatial resolution. DAS has a bright future within the upstream oil and gas industry, though it is currently not well understood because it is such a new technology. The paper discusses both proven and unproven applications of DAS technology.
Abstract In recent years there has been significant growth in the number of Permanent Downhole Measurement System (PDHMS) installations in oil and gas fields around the world, as PDHMS prices have fallen steadily and their reliability has increased. However, the full benefits of this investment can only be realized when the systems move beyond simple surveillance and monitoring to become a source for reservoir characterization. This study assesses the systems and workflows that have been put in place to transform massive amounts of data, including pressure, flow rate, and temperature, into actionable information to improve field development and performance. It presents a dynamic real-time well testing workflow for utilizing data from intelligent fields, and discusses the applicability of pressure transient analysis using I-Field data from Permanent Downhole Gauges to characterize reservoir and well performance.
While seeking to develop an efficient workflow for the utilization of real-time data for pressure transient analysis, the study also assesses the effects of producing time, flow rate simplification, and data interruptions on the reliability of the analysis methods and results. Actual real-time pressure data from PDHMS and flow rate data from Multi-Phase Flow Meters (MPFM) were used in the analyses to determine reservoir parameters and evaluate well performance.
A frequent problem experienced throughout industry is that of missing or poor-quality data in data historians. This can have various causes, such as field instrument failures, loss of communication, or even issues with the setup of the historian itself. The end result is that data required to perform analyses needed to improve facility operations may be unavailable. This generally incurs delays, as the data analyst must manually "clean up" the data before using it, or could even result in erroneous conclusions if the data is used as is, without any corrections. In this paper, a novel multivariate statistical method is proposed to detect incorrect data values and reconstruct corrected values to be stored in the historian. The method works on streaming data, and thus makes its corrections continuously in near real time. It has been successfully tested in a laboratory setting using real operating data from a Chevron facility, and Chevron plans to test it in the field in the near future. Use of this method will ensure that good-quality data for needed analyses is available in the data historian, and will save analyst time as well.
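The paper does not disclose its specific multivariate method, but a common approach to this class of problem is principal component analysis (PCA): learn the correlation structure of known-good sensor data, flag incoming samples whose reconstruction error is large, and write the projection back as the corrected value. The following is a minimal sketch under that assumption; the synthetic training data, threshold, and function names are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: correlated readings from three sensors during
# a period of known-good operation (the paper uses real facility data).
t = rng.normal(size=(500, 1))
good = np.hstack([t, 2 * t, -t]) + 0.05 * rng.normal(size=(500, 3))

mean = good.mean(axis=0)
# Principal subspace of the good data (one component suffices here).
_, _, Vt = np.linalg.svd(good - mean, full_matrices=False)
P = Vt[:1].T  # columns span the retained subspace

def reconstruct(x):
    """Project a sample onto the principal subspace; the result is the
    corrected value that would be written back to the historian."""
    return mean + P @ (P.T @ (x - mean))

def is_faulty(x, threshold=1.0):
    """Flag a sample whose squared prediction error exceeds the threshold,
    i.e. it falls far from the learned correlation structure."""
    return float(np.sum((x - reconstruct(x)) ** 2)) > threshold

sample = np.array([1.0, 2.0, -1.0])  # reading consistent with training data
bad = np.array([1.0, 2.0, 5.0])      # one frozen/failed sensor
print(is_faulty(sample), is_faulty(bad))
print(reconstruct(bad))              # corrected estimate for the bad sample
```

Because the projection is a cheap matrix product, the detect-and-reconstruct step can run per sample on a stream, matching the near-real-time operation described above.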
Some majors have indicated that they will drill up to 20,000 new wells by the end of the decade. To achieve these well numbers and cope with the workload and manpower shortages, drilling automation technologies are being introduced that will allow multiple rigs to be operated remotely with reduced manning. It is premised that well production operations will also have to be automated to cope with the increasing workload, as the resources will not be available to operate these wells in the traditional, manual way. Associated production processes that will have to be automated include well surveillance and optimization, well testing, sampling, chemical injection, and hydrocarbon accounting. It is desirable to bring wells onstream as soon as possible after drilling, to maintain production rates at as high a level as possible, and to simultaneously account for production of oil, gas, and water. It is also imperative to ensure the highest level of safety and environmental protection. Hence, the purpose of this paper is to describe well automation business requirements and benefits, and potential system solutions to optimize production over the life cycle of the ever-increasing number of onshore wells that will be drilled in the near future.
Vanish, Dhruv (Halliburton) | Betancourt, Dayanara (Halliburton) | MdAdnan, Shikin (Halliburton) | Wang, Feng (Halliburton) | Cullick, Alvin Stan (Berry Petroleum Company) | Al-Jasmi, Ahmad Khalid (Kuwait Oil Company) | Villamizar, Miguel (Halliburton) | Nasr, Hatem (Kuwait Oil Company) | Carvajal, Gustavo A (Halliburton) | Urrutia, Karelis (Halliburton) | Velasquez, Guillermo (Halliburton)
Intelligent digital oilfield (iDOF) operations include the transfer, monitoring, visualization, analysis, and interpretation of real-time data. Enabling this process requires a significant investment to upgrade surface, subsurface, and well instrumentation and also the installation of a sophisticated infrastructure for data transmission and visualization. Once upgraded, the system then has the capability to transfer massive quantities of data, converting it into real information at the right time.
The transformation of raw data into information is achieved through intelligent, automated work processes, which are referred to here as "smart flows," which assist engineers in their daily well surveillance activities, helping make them more productive and improve decision making. A major oil and gas operator in the Middle East has invested in such an infrastructure and is developing a set of smart flows for key activities and workflows for its production operations, with the ultimate goal of improved asset performance.
The Sabriyah Mauddud limestone of north Kuwait is a giant depletion-drive oil reservoir currently under the waterflooding process and is being monitored as part of this iDOF operation. A "smart flow" has been developed for visualization and analysis of the waterflooding performance that uses real-time production data to derive and update daily water saturation and reservoir pressure from numerical simulation. The ultimate goal of the smart flow is to identify and define actions to improve water injection and production over the long term.
This paper explains how this smart flow works: it automatically runs a reservoir simulator to estimate water saturation and reservoir pressure, and iteratively calculates waterflooding indicators to provide advice on changing the water injection on a monthly basis.
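As an illustration of how such an automated loop might translate an indicator into monthly injection advice, the sketch below uses the voidage replacement ratio (VRR), a common waterflood indicator; the paper does not name its specific indicators, so the indicator choice, thresholds, and function names here are assumptions:

```python
def voidage_replacement_ratio(injected_rb, produced_rb):
    """Ratio of injected to withdrawn reservoir volume; a VRR near 1.0
    indicates pressure maintenance.  (A common waterflood indicator;
    illustrative stand-in for the paper's unnamed indicators.)"""
    return injected_rb / produced_rb

def advise(vrr, low=0.95, high=1.05):
    """Translate the indicator into a monthly injection recommendation.
    The dead-band thresholds are illustrative."""
    if vrr < low:
        return "increase injection"
    if vrr > high:
        return "decrease injection"
    return "hold injection"

# One monthly cycle over illustrative daily volumes (reservoir barrels);
# in the smart flow these would come from real-time production data and
# the simulator's pressure/saturation update.
injected = [10000.0] * 30
produced = [11000.0] * 30
vrr = voidage_replacement_ratio(sum(injected), sum(produced))
print(round(vrr, 3), advise(vrr))  # VRR below the band -> increase injection
```

In the workflow described above, the same pattern would repeat each cycle: pull real-time rates, run the simulator to refresh saturation and pressure, recompute the indicators, and emit the monthly recommendation.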
Abstract In most electric submersible pump (ESP) applications, a great deal of time and effort is spent on the initial sizing of the ESP: projecting which equipment is most appropriate for the physical properties, reservoir delivery, and operator economics of the particular application. After the ESP is designed, ordered, and installed, however, the sizing is often placed in a repository and not referred to again until the ESP has failed and the next ESP is being designed.
We present a novel methodology for integration of high-angle/horizontal (HA/HZ) well data into 3D geomodels as a natural extension to well placement workflows. Log interpretation, typically done in 2D cross-sections, is based on 1D automated inversion, yielding near-wellbore reservoir structure and properties. The 2D cross-section, updated by inversion and further refined using 2D/3D modeling, is subsequently retrofitted into the geomodel so as to minimally perturb the original topology. Changes in positions/dips/azimuths of boundaries and faults, cell properties, and further local grid refinements are applied automatically. The updated geomodel honors both high-resolution logs and low-resolution seismic and nearby-well data.
We demonstrate this on a typical real-time well placement scenario. Electromagnetic log interpretation codes are integrated as a high-performance computing (HPC) Web service into a geosteering/model-update workflow. As the initial model, we use a 3D geomodel constructed from seismic and vertical-well data. A "curtain" cross-section is extracted, edited based on 1D inversion, and further refined to match HA/HZ logs through 2D and 3D forward modeling, by changing properties/dips/layer thicknesses/fault positions while preserving the original topology. Then, updated node coordinates and cell properties of the affected pillar-grid region are calculated to optimally retrofit the changed 2D cross-section into the grid. These changes are then automatically applied to the 3D model. For quality control, we recompute the 2D cross-section from the refined geomodel. Ultimately, we arrive at a geomodel that honors both seismic and resistivity well-log data. The combination, in a single workflow, of physics-based log modeling codes, a service-oriented architecture, an HPC framework, and a solver to optimally retrofit 2D cross-sections into 3D models creates a qualitatively new opportunity for well placement engineers.
This integrated workflow (1) maximizes the value of deep directional resistivity well logs and real-time well placement interpretation by incorporating them into the source data for building geomodels; (2) radically speeds up the model refinement loop by automatically calculating and applying modifications to the 3D reservoir model; and (3) enables geoscientists to directly refine geomodels while geosteering. The latter has not been standard practice, hindered by the scale difference between geomodels and well logs and by the lack of efficient modeling codes.
The advent of wired pipe technology opens new avenues for decision making in drilling operations while raising new challenges in making sense of the deluge of delivered measurements. Because of the high cost of drilling, speed and safety are the two foremost requirements. Wired pipe technology inaugurates a dramatic change in the way drilling is currently performed: downhole measurements become available at a much higher data rate, alleviating the need for traditional model-based measurements, while sensors distributed along miles of pipe pose the challenge of fusing their outputs for decision making. In addition to handling the overwhelming quantity of information, user-friendly human-machine interfaces must be developed, and drillers must be trained to cope with the new ways of presenting drilling data. Exploiting the latent benefits of wired pipe requires interdisciplinary research that combines data mining, control theory, artificial intelligence, and statistics. In this paper, we give insight into our effort to devise a comprehensive real-time decision-making mechanism built on wired pipe. The main idea is to automate many of the manual operations in drilling so as to achieve higher drilling speed. The paper focuses on optimizing the rate of penetration, controlling pressure within the pressure window, optimizing tripping in/out speed, and managing pump start-up. Accurate methods for estimating the remaining life of the drill bit should be developed in order to avoid unnecessary tripping. The paper also gives insight into artificial intelligence-based paradigms for predicting and detecting drilling incidents such as loss of circulation and hole pack-off.
The abundance of digital calibration information can be leveraged through statistical mining to improve calibration methodologies in tool manufacturing. Unable to store massive data sets, historical tool metrology relied on easily processed summary performance metrics. With modern computing power, however, more computationally expensive analyses of large data sets can be used to refine tool and process metrology, accuracy, and reliability. Empirical data from large sets of quartz memory gauges suggest that tools consistently outperform the accuracy specification derived from historical static calibration. This paper presents an alternative metrological methodology that provides additional insight into the accuracy of calibrated gauges. An analytic framework based on the Satterthwaite confidence interval (CI) is developed using an unpaired-sample t-test, and simultaneous field measurements from five stable pressure segments are used to validate the predicted variances between gauges. The methodology significantly improves measurement accuracy and reliability and is applicable to any instrumentation that undergoes a consistent, repeatable, and linear calibration.
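A minimal sketch of the unpaired (Welch) comparison with Satterthwaite effective degrees of freedom that underlies the CI described above; the gauge readings are illustrative, and the t-quantile is passed in as an argument to keep the example dependency-free:

```python
import math

def satterthwaite_ci(x, y, t_crit):
    """Unpaired (Welch) comparison of two gauges over a stable pressure
    segment.  Returns (mean difference, CI half-width, effective degrees
    of freedom).  t_crit must be the Student-t quantile for the chosen
    confidence level at roughly the returned df; passing it in avoids a
    SciPy dependency in this sketch."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((xi - mx) ** 2 for xi in x) / (nx - 1)  # sample variances
    vy = sum((yi - my) ** 2 for yi in y) / (ny - 1)
    se2 = vx / nx + vy / ny
    # Welch-Satterthwaite effective degrees of freedom
    df = se2 ** 2 / ((vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1))
    return mx - my, t_crit * math.sqrt(se2), df

# Two gauges reading the same stable segment (illustrative psi values):
g1 = [5001.2, 5001.3, 5001.1, 5001.2, 5001.3]
g2 = [5001.6, 5001.7, 5001.5, 5001.6, 5001.8]
diff, half, df = satterthwaite_ci(g1, g2, t_crit=2.31)  # ~95% CI, df ~ 7-8
print(round(diff, 2), round(half, 3), round(df, 1))
```

If zero lies outside `diff ± half`, the two gauges disagree beyond what their scatter over the stable segment explains, which is the validation logic applied to the five field segments.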
In addition to reevaluating existing tool specifications, the methodology can be integrated into metrological practices across the oil industry, where large digital data sets and highly accurate tools and calibration devices are commonly found. The results presented are from memory gauges, but by extension the method is applicable to other formation evaluation endeavors. The analytic framework predicts a factor of proportionality between the accuracy predicted by the CI and the traditional metrics of individually calibrated gauges. Specifically, a relationship is found involving the 2-norm of the mean quadratic deviation (MQD) error of two individually calibrated pressure gauges. A case study of HPHT gauges in five well tests in the North Sea and India confirms the analytic predictions. The paper focuses on quartz memory gauges and improves the absolute and relative accuracies of HPHT gauges by over 33% and 56%, respectively. Comparable analysis can be applied to other tools to better evaluate their metrology. The converse method is useful for determining a process's variance and bias, vital information for evaluating and compensating a facility, process, or device.