The effectiveness of secondary and tertiary recovery projects depends heavily on the operator's understanding of fluid flow characteristics within the reservoir. 3D geocellular models and finite element/difference-based simulators may be used to investigate reservoir dynamics, but that approach generally entails a computationally expensive and time-consuming workflow. This paper presents a workflow that integrates a rapid analytical method with data-analytics techniques to quickly analyze fluid flow and reservoir characteristics and produce near "real-time" results. This fast-track workflow guides reservoir operations including injection fluid allocation, well performance monitoring, surveillance, and optimization, and delivers solutions to the operator through a web application in a cloud-based environment. The system employs a continuity governing equation (Capacitance Resistance Modelling, CRM) to analyze inter-well communication using only injection and production data. The analysis first matches production history to determine the time response between injectors and producers, and simultaneously calculates the connectivity between each pair of wells. Based on the inter-well relationships described by the connectivity network, the workflow facilitates what-if scenarios. It is suited to studying the impact of different injection plans, constraints, and events on production estimation, performance monitoring, anomaly alerts, flood breakthrough, injection fluid supply, and equipment constraints. The system also allows automatic injection redesign based on different numbers of injection wells to guide injection allocation and drainage volume management for flood optimization. A field located in the Midland Basin, consisting of 11 injectors and 22 producers, was analyzed to optimize flood recovery efficiency and support surveillance.
After optimization, a solution delivering 30% incremental oil production over an 18-month period was derived. The analysis also predicted several instances of early water breakthrough and high water cut, along with mitigation options. The system couples established waterflood analytics (CRM) and modern data analytics with a web-based deliverable to provide operators with near "real-time" surveillance and operational optimization.
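The CRM material balance described above can be sketched in a few lines. This is a minimal illustration of one producer's rate update, assuming a single time constant per producer and omitting the bottomhole-pressure term; all names and values are hypothetical, not the paper's implementation:

```python
import math

def crm_rate(prev_rate, injection_rates, gains, tau, dt):
    """One CRM time step: exponential decay of the previous producer rate plus
    the connectivity-weighted injection signal (bottomhole-pressure term omitted).
    gains[i] is the connectivity f_ij between injector i and this producer."""
    decay = math.exp(-dt / tau)
    supported = sum(f * inj for f, inj in zip(gains, injection_rates))
    return prev_rate * decay + (1.0 - decay) * supported

# Hypothetical producer supported by two injectors:
gains = [0.4, 0.3]       # connectivities (dimensionless, sum <= 1)
tau = 30.0               # time constant, days
q_next = crm_rate(800.0, [1000.0, 1200.0], gains, tau, dt=1.0)
```

In a history match, the gains and time constants would be regressed against observed production; the connectivity network the paper describes is the matrix of fitted `f_ij` values.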
Liu, Guoxiang (Baker Hughes a GE Company) | Stephenson, Hayley (Baker Hughes a GE Company) | Shahkarami, Alireza (Baker Hughes a GE Company) | Murrell, Glen (Baker Hughes a GE Company) | Klenner, Robert (Energy & Environmental Research Center, University of North Dakota) | Iyer, Naresh (GE Global Research) | Barr, Brian (GE Global Research) | Virani, Nurali (GE Global Research)
Optimization problems, such as optimal well spacing or completion design, can be resolved rapidly via surrogate proxy models, which can be built using either data-based or physics-based methods. Each approach has its strengths and weaknesses with respect to uncertainty management, data quality, and validation. This paper explores how data- and physics-based proxy models can be used together in a workflow that combines the strengths of each approach and delivers an improved representation of the overall system. Use cases are presented that show reduced simulation cost and/or reduced uncertainty in model outcomes. A Bayesian calibration technique improves predictability by combining numerical simulations with data regressions: discrepancies between observations and surrogate outcomes are used to calibrate the model, improve prediction quality, and further reduce uncertainty. Furthermore, Gaussian process regression is used to locate global minima/maxima with a minimal number of samples. To demonstrate the methodology, a reservoir model involving two wells in a drilling spacing unit (DSU) in the Bakken Formation was constructed using publicly available data and tuned by history matching the production data for the two wells. A data-based regression model was constructed with machine-learning techniques using the same dataset. The two models were coupled into a hybrid model to test the proposed data-physics coupling for completion optimization and uncertainty reduction. Subsequently, the Gaussian process model was used to explore optimization scenarios outside the data region of confidence and to exploit the hybrid model, further reducing uncertainty and improving predictions. Overall, both the computation time to identify optimal completion scenarios and the associated uncertainty were reduced.
This technique creates a robust framework to improve operational efficiency and accelerate completion optimization. The hybrid modeling workflow has also been piloted in other applications, such as completion design, well placement optimization, parent-child well interference analysis, and well performance analysis.
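The Gaussian process regression step described above rests on posterior prediction from a handful of samples. A minimal sketch with an RBF kernel and a small dense solver follows; the data, kernel, and hyperparameters are illustrative assumptions, not from the paper:

```python
import math

def rbf(a, b, length=1.0):
    """Squared-exponential (RBF) covariance between two scalar inputs."""
    return math.exp(-0.5 * ((a - b) / length) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(x_train, y_train, x_new, noise=1e-6):
    """GP posterior mean and variance at x_new (zero prior mean, RBF kernel)."""
    K = [[rbf(xi, xj) + (noise if i == j else 0.0)
          for j, xj in enumerate(x_train)] for i, xi in enumerate(x_train)]
    alpha = solve(K, list(y_train))
    k_star = [rbf(x_new, xi) for xi in x_train]
    mean = sum(ks * a for ks, a in zip(k_star, alpha))
    w = solve(K, k_star)
    var = rbf(x_new, x_new) - sum(ks * wi for ks, wi in zip(k_star, w))
    return mean, var

# Hypothetical objective samples at three design settings; predict at an untried point
xs, ys = [0.0, 1.0, 2.0], [1.0, 2.0, 1.5]
m, v = gp_posterior(xs, ys, 0.5)
```

The posterior variance is what a sample-efficient search exploits: new simulations are placed where predicted improvement (high mean, high variance) is largest, which is how a global optimum can be located with few samples.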
In Vietnam, there was a need for a lean surface casing due to a restricted drift inside diameter (ID): the 2nd slot of the splitter conductor allows a maximum pass-through of only 13-1/2" ID. The practical option is to drill with a 12-1/4" bit and underream to a 14-1/2" hole to set 11-3/4" OD casing. Similar reasoning applies to the intermediate hole, which must be underreamed from a 10-5/8" bit size to a 12-1/4" hole to set 9-5/8" OD casing. Although these underreaming operations are commonly practiced, conventional tools remain inefficient and compromising: they allow only a limited number of activation/deactivation cycles, restricting operational flexibility, and leave a long rathole between the reamer and bit depth, complicating casing shoe placement.
The long-awaited technology is now available in intelligent reamers, which offer unlimited activation and deactivation cycles and can be placed directly above the rotary steerable system for the shortest possible rathole. The setup combines two intelligent reamers in a single BHA: the 1st reamer is placed strategically above the MWD and LWD tools, while the 2nd reamer sits directly above the rotary steerable system. Because both reamers can be activated and deactivated independently through downlinking, they do not have to run simultaneously, which controls the risks associated with hole opening and LWD data acquisition. The 1st intelligent reamer is activated while drilling the section, and the 2nd intelligent reamer is then activated at section TD to ream and shorten the rathole. To clean the hole effectively, both reamers can be deactivated, allowing high flow rates and RPM without the reamer blades creating new cuttings or enlarging the low side of the hole.
This enabled shoe-to-shoe drilling while underreaming and achieved a rathole of less than 10 m. These operational capabilities saved at least 50% of section rig time compared with a two-trip system. The combination of a reduced casing-shoe rathole and reduced open hole exposure mitigated wellbore instability risks and helped manage mud weight across both hole-section intervals. The unlimited activation cycles provided operational flexibility, particularly for hole cleaning and wiper trips. In addition, the intelligent reamer reports real-time reamer diameter, which gives confidence in the drilled hole size for casing running preparation and decisions.
Intelligent reamers have unique tool features that differentiate them from other current industry technologies. These features help eliminate the risk of underreamer balling, which improves the rate of penetration. The success of the operation has spread among operators in Vietnam, and the intelligent reamer is now considered a game-changing application for drilling lean casing profiles.
Padron, Tomas (Baker Hughes a GE Company)
Research and field experience have provided the technical basis for accepting CT80, CT90, and CT100 coiled tubing (CT) for sour service, including their limitations. New higher-strength grades are now available (CT110 to CT140), but there is not enough awareness of the potential issues, suitability, and boundaries regarding their use in H2S environments. This paper presents the potential issues and indicates the aspects that should be considered for the qualification of the new CT grades.
Research data were gathered from a review of technical publications on CT and sour service dating back to 1982, together with field experience using CT in wells containing H2S. These sources were used to establish the potential problems of working with CT in sour environments and to define what should be considered for qualifying the new CT grades available in the market, including the testing options that cover the aspects needed for that qualification.
The analysis established that the potential problems of using CT in sour service include reduced available fatigue life and various forms of H2S cracking. Accordingly, the aspects that should be included when qualifying a CT grade for sour service are: sour fatigue performance for the pipe body and bias weld, boundary limits on H2S partial pressure, and the effectiveness of anti-cracking inhibitors in protecting the pipe against H2S cracking. The proposed testing options include bending fatigue testing of CT samples after exposure to H2S, and small-scale sample testing per NACE TM0177 (Methods A, B, or C).
More awareness of the potential issues of using high-strength CT grades in sour service is required, and no formal industry guidelines exist for their qualification. This paper summarizes the potential problems that can result from exposure of CT to H2S (laboratory data and field failure examples) and establishes what should be considered for qualifying a CT grade for sour service, including the testing options.
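The H2S partial-pressure boundary mentioned above is typically checked with Dalton's law against the NACE MR0175/ISO 15156 sour-service threshold. A minimal sketch, in which the example pressure and composition are hypothetical:

```python
def h2s_partial_pressure(total_pressure_psia, h2s_mole_fraction):
    """Dalton's law: partial pressure of H2S in the gas phase."""
    return total_pressure_psia * h2s_mole_fraction

# NACE MR0175/ISO 15156 commonly uses 0.05 psia as the sour-service threshold.
SOUR_THRESHOLD_PSIA = 0.05

def is_sour_service(total_pressure_psia, h2s_mole_fraction,
                    threshold=SOUR_THRESHOLD_PSIA):
    """True when the H2S partial pressure exceeds the sour-service threshold."""
    return h2s_partial_pressure(total_pressure_psia, h2s_mole_fraction) > threshold

# Hypothetical well: 2000 psia with 100 ppm H2S -> 0.2 psia, i.e. sour service
p_h2s = h2s_partial_pressure(2000.0, 100e-6)
```

Even a low H2S concentration can place a high-pressure well in sour service, which is why the partial-pressure boundary, rather than concentration alone, drives qualification.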
Although much has been written about methods for estimating and interpreting log measurements, new emphasis must be given to data quality as data-mining methods become more prevalent, because these methods depend heavily on the quality of the originally acquired data sets. Wireline and logging-while-drilling (LWD) technologies have advanced to a level where today's analysts frequently assume acquired measurements are correct. Unless problems surface when the data are integrated, quality issues may go unrecognized in subsequent data-driven analytics (DDA). Assumptions regarding the quality of modern data are generally valid, but they often fail when borehole conditions degrade beyond the physical measurement limits of the instruments, or when older data from different tool designs are mixed with modern data.
When wellbore conditions reach the point where data degradation occurs, the information must be corrected for environmental and borehole geometry effects or, in extreme cases, the data must be reconstructed. Data reconstruction/estimation can take many forms, including application of regional trends, transformation of one measurement type into another, extrapolation of offset-well data to the well of interest, use of offset openhole data combined with cased-hole data to predict measurements in adjoining wells where only cased-hole data are available, and extraction of measurements from seismic data. The methods involved range from empirical algorithms and regional trend analysis to statistical inference, neural networks, data mining, and other forms of machine learning. Successful application of any method requires data that are representative of the formations and acquired under optimal conditions, so that they properly define the relationships to be derived. New evaluation technologies (log, lab, algorithms) must be assessed before data quality is assumed for these new technologies.
Interpretation algorithms applied to the data are no different, in that data quality is assumed for the analysis models to function correctly. Both proprietary internal and third-party external interpretation packages have problems when data quality suffers.
Proper data reconstruction requires an understanding of the quality of the acquired data (calibrations, accuracy, etc.), the instrument configuration in the toolstring, the acquisition methodology, and the condition of the wellbore environment when the data were acquired.
We examine the application of several methods used in data preconditioning and data reconstruction, along with some novel statistical methods the authors employ, with examples from various environments. Validation of the reconstructed data sets is also demonstrated. With the digital oil field (DOF) accessing single and multiple data stores for geological and engineering applications, quality-assured, meaningful log and petrophysical data must be part of these data stores. Recommendations are provided for data-quality assurance before data are employed in mining databases.
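As a minimal sketch of one reconstruction method listed above, transforming one measurement type into another via a regression trend, the following fits a linear trend on an interval where both logs exist and estimates the missing log elsewhere. The log values are synthetic and the log pairing is illustrative, not the authors' method:

```python
def fit_linear(x, y):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Hypothetical training interval where both logs were acquired:
gr = [20.0, 40.0, 60.0, 80.0, 100.0]   # gamma ray, API (synthetic)
dt = [55.0, 65.0, 75.0, 85.0, 95.0]    # sonic, us/ft (synthetic)

a, b = fit_linear(gr, dt)

def reconstruct_dt(gr_value):
    """Estimate a missing sonic reading from gamma ray using the fitted trend."""
    return a + b * gr_value
```

In practice the training interval must itself pass the quality checks described above; a trend fitted to degraded data simply propagates the degradation into the reconstructed curve.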
The oil industry has been experiencing issues with "slugs" of gas that limit the ability of the pump to create head and cause a low-amp or underload shutdown. Baker Hughes has developed special gas inlets to improve flow in high-gas wells. Electric submersible pump (ESP) performance is evaluated with a new "slug" test to determine how a slug (or short burst) of gas affects the output flow.
Several different inlet devices are tested with a mixed flow pump in a high gas environment using water and nitrogen. An annular ring introduces gas below the pump inlet and partial flow is allowed to bypass the pump inlet to simulate real well conditions. Three inlet devices are compared to a standard pump inlet to determine how long it takes a slug of gas to terminate flow. In addition, these inlet devices are tested at steady state conditions with different flow rates and gas volume fractions (GVF).
The gas-slug-handling capabilities of a standard mixed-flow pump stage are compared with three different inlet devices. The test is repeated at several different flow rates to document the time required for a slug of gas to stall the system. When a slug of 95% gas is introduced, a standard pump running at 1000 bpd stalls in 20 seconds, compared with 70 seconds for the best inlet. When the system is tilted 30 degrees off vertical, a slug of gas causes all of them to stall in 30 to 41 seconds.
Steady-state gas tests were conducted at a variety of gas volume fractions to find the resulting output flow and pressure for each inlet device at flows between 500 and 2500 barrels per day. The resulting gas curves are presented for each inlet device with the system running vertically and 30 degrees off vertical. One of the inlets generated four times the output pressure of a standard pump at 40% gas volume fraction. The production inlet performed better than any other inlet with the system tilted 30 degrees off vertical.
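The gas volume fraction used to characterize these tests is simply the gas share of the total volumetric rate at intake conditions. A minimal sketch, with hypothetical example rates:

```python
def gas_volume_fraction(q_gas, q_liquid):
    """GVF = gas volumetric rate / total volumetric rate.
    Both rates must be in the same units at the same (intake) conditions."""
    return q_gas / (q_gas + q_liquid)

# Hypothetical intake conditions: equal in-situ gas and liquid rates -> 50% GVF
gvf_steady = gas_volume_fraction(1000.0, 1000.0)

# A 95% gas slug corresponds to a gas rate 19x the liquid rate
gvf_slug = gas_volume_fraction(950.0, 50.0)
```

Because gas is compressible, the in-situ gas rate (and hence GVF) at the pump intake depends on intake pressure and temperature; the surface gas rate must be converted to intake conditions before this ratio is meaningful.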
An LWD prototype multipole source has been characterized in a water tank. The experimental setup and measurements are briefly discussed and placed in the context of an existing theoretical framework. Far-field directionalities in terms of azimuth and inclination are compared to theory, and differences (where present) are scrutinized and explained. Measured far-field multipole waveforms are compared to synthetic waveforms, where the latter assume a known actuator model. Exemplary results are presented for the monopole excitation and are in good agreement with the aforementioned theory.
Klenner, Rob (Baker Hughes a GE Company) | Liu, Guoxiang (Baker Hughes a GE Company) | Stephenson, Hayley (Baker Hughes a GE Company) | Murrell, Glen (Baker Hughes a GE Company) | Iyer, Naresh (GE Global Research) | Virani, Nurali (GE Global Research) | Charuvaka, Anveshi (GE Global Research)
Frac hits are a form of fracture-driven interference (FDI) that occurs when newly drilled wells communicate with existing wells during completion, and they may affect production negatively or positively. An analytics and machine-learning approach is presented to characterize and aid understanding of the root causes of frac hits. The approach was applied to a field data set and indicated that frac hits can be quantitatively attributed to operational or subsurface parameters such as spacing or depletion. The approach analyzed a 10-well pad comprising two ‘parent’ producers and eight ‘child’ infills, using the following data types: microseismic, completion, surface and bottomhole pressure, tracers, production, and petrophysical logs. The method followed a three-step process: 1) use analytics to assess interference during hydraulic fracturing and during production, 2) extract feature-engineering attributes for each stage (offset distance, petrophysics, completion, and depletion), and 3) apply machine-learning techniques to identify which attributes (operational or subsurface) are significant in causing and/or enhancing inter-well communication. Information fusion with multi-modal data was also used to determine the probability of well-to-well communication; by integrating data from multiple sensors, the fusion technique achieves a lower detection-error probability and higher reliability. The results showed that the infill wells completed in closest proximity to the depleted parents exhibit strong communication. The machine-learning classification creates rules that enable better understanding of control variables to improve operational efficiency. Furthermore, the methodology provides a framework for visualization, continuous learning, and real-time application to mitigate communication during completions.
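The multi-sensor information fusion described above can be illustrated with a naive-Bayes combination of independent evidence, in which each sensor contributes a likelihood ratio for "communication" versus "no communication". The prior and per-sensor ratios below are hypothetical, and the paper's actual fusion scheme may differ:

```python
def fuse_detections(prior, likelihood_ratios):
    """Naive-Bayes fusion of independent sensor evidence.
    prior: prior probability of well-to-well communication.
    likelihood_ratios: P(obs | communication) / P(obs | none), one per sensor."""
    odds = prior / (1.0 - prior)            # convert probability to odds
    for lr in likelihood_ratios:
        odds *= lr                          # each sensor multiplies the odds
    return odds / (1.0 + odds)              # convert back to probability

# Hypothetical: pressure, tracer, and microseismic evidence each mildly favor
# communication; no single sensor is decisive, but together they are
p_comm = fuse_detections(0.5, [2.0, 3.0, 1.5])
```

This illustrates why fusing modalities lowers detection-error probability: several weak, independent pieces of evidence compound multiplicatively in the odds, while a single noisy sensor would leave the call ambiguous.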
The continuous evaluation of electric submersible pump (ESP) performance and its integration within the energy management system for Blocks 16 & 67 of Ecuador enabled the development of a specific procedure to assess ESP efficiency and identify energy improvement opportunities. The procedure is summarized in the Significance Matrix, a tool that integrates ESP field data, such as production history and electrical measurements, to estimate hydraulic and electrical power requirements, together with a specific method for efficiency evaluation. The result of the Significance Matrix is the categorization of each ESP system as a significant or non-significant use of energy. Significant uses of energy are further analyzed to design an action plan that is prioritized through technical and economic assessment. The comparison between the Significance Matrix of December 2014 and that of November 2017 showed that optimizing eight (8) ESP systems resulted in an average reduction of 49 BDPD in fuel consumption, translating to a decrease in greenhouse gas emissions of 20 tCO2 per day.
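The efficiency evaluation underlying a Significance Matrix reduces to comparing hydraulic output power with measured electrical input power. A minimal sketch in SI units; the well values are hypothetical and the paper's actual method may differ:

```python
def hydraulic_power_kw(flow_m3_per_day, head_m, specific_gravity=1.0):
    """Hydraulic power = rho * g * Q * H, converted to kW."""
    rho = 1000.0 * specific_gravity      # fluid density, kg/m3
    g = 9.81                             # gravitational acceleration, m/s2
    q = flow_m3_per_day / 86400.0        # volumetric rate, m3/s
    return rho * g * q * head_m / 1000.0

def esp_efficiency(flow_m3_per_day, head_m, electrical_power_kw, sg=1.0):
    """System efficiency = hydraulic output power / electrical input power."""
    return hydraulic_power_kw(flow_m3_per_day, head_m, sg) / electrical_power_kw

# Hypothetical well: 500 m3/d against 1500 m of head, 200 kW measured at surface
eff = esp_efficiency(500.0, 1500.0, 200.0)
```

Systems falling below an efficiency benchmark for their rate and head would be flagged as significant energy uses and passed to the technical and economic assessment.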
The deposition of carbonate and sulphate scales is a major problem during oil and gas production. Managing scale with chemical methods, involving prevention and/or removal, is the preferred approach to maintaining well production. However, chemical scale control is not always an option, depending upon the nature of the reservoir and well completion, and, in cases of severe scaling, the problem can render chemical treatments uneconomic unless other, non-chemical methods are utilised.
A variety of non-chemical scale control methods exist, the most common being injection of low-salinity brines or low-sulphate seawater (LSSW), produced using reverse osmosis and a sulphate removal plant (SRP), respectively. Other methods include careful mixing of lift gas, produced water and re-injection water, coatings, and smart well completions with active inflow control devices (ICDs) and sliding sleeves (SSs).
All of these techniques, including combinations thereof, are currently in use, and the advantages and disadvantages of the key techniques are compared to chemical methods for both carbonate and sulphate scale control. A detailed example from a North Sea field demonstrates how downhole chemical scale control was made unnecessary through a strategy of careful mixing of lift gas, brines, and produced water re-injection, combined with an understanding of fluid flow paths in the reservoir and their likely breakthrough at production wells.
Consideration is also given to the injection of "smart" brines to control scale deep in the reservoir, supported by data from North Sea chalk fields.
This paper presents a comprehensive review of non-chemical methods for downhole scale control and discusses how the use of these techniques can provide alternative scale management strategies through minimising or alleviating the need for downhole chemical treatments.