A hybrid model based on physics of failure and data-driven algorithms is developed to estimate the remaining useful life of production casing (a well barrier). The state of integrity of the well barrier is assessed by updating its reliability under operational loads.
The interactions between the casing and the surrounding formation, and the effects of tribocorrosion on the casing, are considered. Tribocorrosion is the degradation of a material resulting from a sequential process of (i) mechanical wear (due to sliding, friction, or impact) followed by (ii) corrosive action of the surrounding environment. The model simulates casing wear due to drilling and the enhanced degradation caused by downhole conditions.
The main capability of the model is to provide well integrity analysts with insight into the future health states of a monitored well. This is achieved in two main steps: the first is an offline module comprising the degradation models; the second is pattern recognition based on well logs and feature mapping, followed by estimation of the remaining useful life of the well barrier. The grade P-110 production casing undergoes a reduction in strength due to wear during drilling, induced stress, and hydrogen-induced cracking. The remaining useful life is calculated for the depths and times of interest.
A comparative analysis is carried out using the industry-standard soft-string model versus a more comprehensive stiff-string model to estimate wear. The paper presents a unique approach to predicting the remaining useful life of a well barrier and the dynamic state of the well's operational integrity. The prediction is not based solely on statistical modeling but also incorporates barrier engineering and physics of failure.
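The reliability-updating step can be illustrated with a minimal Monte Carlo sketch. All numbers below (wear rate, load, uncertainty levels) are illustrative assumptions, not values from the paper; casing strength is approximated here with a Barlow-type burst rating, which is only one of the failure modes the hybrid model considers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not values from the paper)
YP_PSI = 110_000      # P-110 minimum yield strength, psi
OD_IN = 9.625         # casing outer diameter, in
T0_IN = 0.545         # nominal wall thickness, in
WEAR_RATE = 0.004     # assumed mean wall loss, in/year
LOAD_PSI = 6_000      # assumed internal pressure load, psi
N = 20_000            # Monte Carlo sample size

def burst_rating(t):
    """Barlow-type burst rating (psi) for wall thickness t (in)."""
    return 0.875 * 2.0 * YP_PSI * t / OD_IN

def reliability(years):
    """P(burst rating > load) after `years` of wall-thickness degradation."""
    t0 = rng.normal(T0_IN, 0.010, N)        # initial thickness uncertainty
    rate = rng.normal(WEAR_RATE, 0.001, N)  # degradation-rate uncertainty
    t = np.clip(t0 - rate * years, 0.0, None)
    return float(np.mean(burst_rating(t) > LOAD_PSI))

def remaining_useful_life(target=0.99, horizon=60):
    """First year at which reliability drops below `target` (years)."""
    for yr in range(horizon + 1):
        if reliability(yr) < target:
            return yr
    return horizon

rul_years = remaining_useful_life()
```

In the paper's workflow the degradation model itself would be updated from monitored loads and well logs before each reliability evaluation; here the degradation is a fixed linear draw for brevity.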
Liu, Yu (Shell International Exploration and Production Inc) | Kibbey, Justin (Shell International Exploration and Production Inc) | Bai, Yanbin (Shell International Exploration and Production Inc) | Wu, Xianping (Shell International Exploration and Production Inc)
Severe bit damage is a known issue in west Texas land drilling due to abrasive sand formations and interbedded hard stringers. Operational performance and rig cost are often impacted by DBR (damaged beyond repair) bit penalty fees, low ROP (rate of penetration) from worn bits, and inefficient decision-making on tripping. A real-time data analytics application is developed to provide actionable information to operations and expedite the decision-making process.
A historical dataset of surface mechanics data and bit records is collected from 40 bit runs drilled in 2016 and early 2017. A hybrid data analytics procedure, consisting of conventional physical modeling of drilling mechanics and supervised machine learning, is conducted to separate bit failure patterns from normal formation transitions and drilling-parameter adjustments. A metric-based algorithm is constructed for real-time monitoring of bit drilling performance and early warning of bit cutter wear.
Web-based real-time software is developed and field-trialed on three wells with satisfactory results. Subsequent deployments in the DART (Drilling Automation Remote Technology) center and field offices have been quickly rolled out for five rigs in west Texas, with positive feedback from operations personnel and engineers. Owing to the success of the agile development framework and adaptive software architecture, other advisory-mode features, such as motor life monitoring, smart-tripping evaluation, and sliding diagnosis, are under development.
The application discussed in this paper combines experts' domain knowledge with machine learning techniques and provides actionable information to support on-site operational decisions. The development and deployment of this application follow an agile innovation framework, through which operational needs and technical solutions are quickly bridged and tangible business value is delivered in the short term.
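The paper does not disclose its monitoring metric, but a standard surface-data indicator of drilling inefficiency and cutter wear is Teale's mechanical specific energy (MSE). The sketch below shows what a metric-based early-warning rule might look like; the rolling-window size and trend ratio are hypothetical tuning parameters, not the paper's values.

```python
import numpy as np

def mse_psi(wob_lb, rpm, torque_ftlb, rop_fthr, bit_diam_in):
    """Teale's mechanical specific energy (psi):
    MSE = WOB/A + 120*pi*RPM*T / (A*ROP), with A the bit face area (in^2),
    WOB in lb, T in ft-lb, and ROP in ft/hr."""
    area = np.pi * bit_diam_in ** 2 / 4.0
    return wob_lb / area + 120.0 * np.pi * rpm * torque_ftlb / (area * rop_fthr)

def wear_warning(mse_series, window=20, ratio=1.5):
    """Flag when the recent rolling-mean MSE exceeds `ratio` times the
    baseline established early in the bit run (hypothetical threshold,
    tuned per field). A sustained MSE rise at constant parameters is a
    common signature of cutter dulling."""
    mse = np.asarray(mse_series, dtype=float)
    if len(mse) < 2 * window:
        return False  # not enough data to establish a baseline
    baseline = mse[:window].mean()
    recent = mse[-window:].mean()
    return bool(recent > ratio * baseline)
```

In a real deployment the baseline would be re-established per formation to avoid confusing formation transitions with bit failure, which is exactly the separation the paper's supervised-learning step performs.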
Multi-finger caliper tools are widely used to inspect the condition of the casing inner wall, particularly to determine wear, corrosion, deformation, and scaling. The maximum pipe wall penetration, and hence metal loss, inferred from this method depends greatly on the casing inner diameter value (provided by the casing manufacturer) used as a reference. Evidence of overestimated casing wear, including in new completions, has raised questions about the validity of such interpretations.
Most casing is manufactured to the API 5CT (ISO 11960) standard. The manufacturer's specification has an associated tolerance on casing outer diameter and wall thickness that is then used in multi-finger caliper log interpretation. For casing with an outer diameter (OD) greater than or equal to 4.5″, the OD tolerance is between -0.5% and +1% of OD. The pipe wall thickness tolerance ranges between the manufacturer's nominal thickness and 12.5% metal loss.
Calculating the casing internal diameter (ID) tolerance is difficult because the ID is controlled by the casing mass tolerance, which in practice is not easy to measure. To overcome this challenge, we implemented a new approach: calculating the minimum and maximum casing inner diameter and then using this ID tolerance as the reference for multi-finger caliper log interpretation.
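A minimal sketch of the ID-tolerance calculation, using only the body tolerances quoted above (it deliberately ignores the mass-tolerance constraint that the paper's full approach accounts for):

```python
def casing_id_tolerance(od_nom_in, t_nom_in):
    """Min/max internal diameter from the API 5CT body tolerances quoted
    above, valid for OD >= 4.5 in: OD in [-0.5 %, +1.0 %] of nominal;
    wall thickness from nominal down to -12.5 %. Simplified: the mass
    tolerance that actually bounds the upper wall thickness is ignored."""
    od_min = od_nom_in * (1.0 - 0.005)
    od_max = od_nom_in * (1.0 + 0.010)
    t_min = t_nom_in * (1.0 - 0.125)
    t_max = t_nom_in
    id_min = od_min - 2.0 * t_max   # thinnest bore: small OD, full wall
    id_max = od_max - 2.0 * t_min   # largest bore: big OD, thin wall
    return id_min, id_max

# Example: 9-5/8 in casing with 0.545 in nominal wall (nominal ID 8.535 in)
id_lo, id_hi = casing_id_tolerance(9.625, 0.545)
```

The interpretation consequence is immediate: a caliper reading anywhere inside [id_lo, id_hi] is consistent with an unworn pipe, so wear should be reported against this band rather than against the single nominal ID.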
This approach has been tested with field data, correlates well with actual casing inner diameter measurements performed at surface, and is used as a baseline for new casing strings.
This paper will describe in detail the approach taken and compare the difference between interpretation outcomes with and without using the casing inner diameter tolerance. The benefits of calculating casing inner diameter tolerance and applying it to multi-finger caliper log interpretation results will be backed up with evidence from field log data acquired in ACG.
An advanced stiff-string model used during casing wear simulation to evaluate the drillstring contact position at any given casing depth can reflect the actual downhole scenario more precisely. The model yields a more representative wall thickness reduction percentage along the casing string by estimating multiple groove positions at each cross section. The simulated casing wear severity is subsequently incorporated into tubular design calculations by amending the critical dimensions for each stress criterion to represent the worn tubular operating limits. The integration process reduces the safety margin for the modeled loading conditions that the casing string is expected to withstand during the well's life cycle.
This paper highlights the different methodologies applied during casing wear simulation and discusses how to integrate the results into the casing design and stress analysis workflow to predict the critical failure mode for the investigated tubular string.
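As an illustration of how simulated wear can be folded into one stress criterion, the sketch below de-rates a Barlow-type burst rating by the simulated groove depth and recomputes the safety margin. The paper's workflow amends the critical dimensions for every stress criterion (burst, collapse, triaxial), not just burst; the numbers in the example are illustrative.

```python
def worn_burst_psi(yield_psi, od_in, t_nom_in, wear_pct):
    """Barlow-type burst rating (psi) after reducing the nominal wall
    thickness by the simulated wear groove depth (wear_pct, % of wall).
    Treats the deepest groove as a uniform wall reduction, a common
    conservative simplification."""
    t_worn = t_nom_in * (1.0 - wear_pct / 100.0)
    return 0.875 * 2.0 * yield_psi * t_worn / od_in

def burst_safety_factor(yield_psi, od_in, t_nom_in, wear_pct, load_psi):
    """Design safety factor against the modeled internal pressure load."""
    return worn_burst_psi(yield_psi, od_in, t_nom_in, wear_pct) / load_psi

# Example: 9-5/8 in, 0.545 in wall, P-110; 10 % simulated wear
sf_new = burst_safety_factor(110_000, 9.625, 0.545, 0.0, 6_000)
sf_worn = burst_safety_factor(110_000, 9.625, 0.545, 10.0, 6_000)
```

Because Barlow is linear in wall thickness, a 10% groove cuts the burst margin by exactly 10%; collapse resistance degrades faster than linearly, which is why the stiff-string model's per-groove geometry matters for identifying the critical failure mode.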
This paper discusses a new and general method of backup cutter layout that extends bit life without sacrificing rate of penetration, along with two field case studies. The method includes the following aspects: (1) backup cutters do not cut, or only partially cut, when their primary cutters experience little to no wear and the bit's depth of cut does not exceed an expected value; this is enabled by giving each backup cutter a critical depth of cut greater than the depth of cut of the primary cutters. (2) Backup cutters act as major cutters when the primary cutters' wear depth equals or exceeds the underexposure of the backup cutters; this is enabled by placing each backup cutter approximately 180° or more rotationally behind its primary cutter. The underexposure of each backup cutter relative to its primary cutter is carefully calculated from the primary cutter's wear and the drilling slope.
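The engagement logic described above can be sketched as a simple one-dimensional model in which the backup cutter's critical depth of cut equals its remaining underexposure below the (worn) primary cutter. This is a deliberate simplification of the actual 3-D cutter layout for illustration only.

```python
def backup_doc(bit_doc_in, primary_wear_in, underexposure_in):
    """Per-revolution depth of cut taken by a backup cutter (in).

    The backup sits `underexposure_in` below its primary cutter's cutting
    face, so its effective critical depth of cut is the underexposure that
    remains after the primary has worn. The backup removes only the part
    of the bit's depth of cut that exceeds this remaining underexposure."""
    remaining = max(underexposure_in - primary_wear_in, 0.0)
    return max(bit_doc_in - remaining, 0.0)
```

The three regimes in the text fall out directly: a sharp primary at normal depth of cut leaves the backup idle; an excessive depth of cut makes it partially cut; and once primary wear consumes the underexposure, the backup takes the full depth of cut as a major cutter.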
As oil and gas well geometry becomes more complex, identifying the locations of critical bit wear becomes increasingly difficult. The analytical models currently used in the drilling industry cannot fully capture the downhole dynamics and are therefore incapable of predicting bit wear. Current models either overestimate wear or are too conservative, while some require too many parameters, constants, and coefficients. There is no accepted universal mathematical model that describes bit wear; thus, this paper proposes a method that does not use a model at all. This study focuses on harnessing the impact of real-time downhole dynamics on drilling parameters to predict bit wear using adaptive data analysis techniques.
This paper identifies a parameter that strongly reflects the impact of downhole dynamics. Though other parameters can also be used, real-time rate of penetration (ROP) is the basic parameter that is always available. Depending on availability, different parameters can be combined and entered into the algorithm. For example, ROP and gamma ray data can be convolved in an appropriate manner to predict bit wear and determine whether the bit wear is caused by a formation change.
This work proposes an adaptive data analytics algorithm. At each depth, the algorithm decomposes the continuously arriving incremental ROP data, delivered in Wellsite Information Transfer Standard Markup Language (WITSML) format, into its intrinsic mode functions (IMFs) using empirical mode decomposition (EMD). The trend of the final IMF at each depth is continuously monitored to predict bit wear. To keep the process fast and straightforward in practice, the energy of the final IMF (IMF energy) is also continuously computed. The algorithm can be implemented using incremental depth analysis or incremental depth interval analysis. Depth interval analysis can be used during the preplanning stage of drilling operations to identify the intervals in which field personnel should be more cautious; incremental depth analysis can be used during drilling operations.
The proposed workflow was tested on historical wells, and the results were compared with analytical models.
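The decomposition step can be sketched crudely as below, using a basic sifting loop with endpoint-clamped cubic-spline envelopes. Production EMD implementations use more careful boundary handling and stopping criteria; this sketch only illustrates the IMF extraction and the IMF-energy metric the abstract describes.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift(x, max_iter=10):
    """Extract one intrinsic mode function (IMF) by iterative sifting."""
    h = np.asarray(x, dtype=float).copy()
    n = len(h)
    for _ in range(max_iter):
        d = np.diff(h)
        left, right = np.hstack([0.0, d]), np.hstack([d, 0.0])
        maxima = np.where((left > 0) & (right < 0))[0]
        minima = np.where((left < 0) & (right > 0))[0]
        if len(maxima) < 2 or len(minima) < 2:
            break  # too few extrema to build envelopes
        # clamp envelopes at the signal endpoints to avoid extrapolation
        maxima = np.concatenate(([0], maxima, [n - 1]))
        minima = np.concatenate(([0], minima, [n - 1]))
        t = np.arange(n)
        env_mean = 0.5 * (CubicSpline(maxima, h[maxima])(t)
                          + CubicSpline(minima, h[minima])(t))
        h = h - env_mean
        if np.mean(env_mean ** 2) < 1e-8 * np.mean(h ** 2):
            break  # envelope mean is negligible: h is IMF-like
    return h

def emd(x, max_imfs=6):
    """Crude empirical mode decomposition: returns (imfs, residual)."""
    imfs, residual = [], np.asarray(x, dtype=float).copy()
    for _ in range(max_imfs):
        imf = sift(residual)
        imfs.append(imf)
        residual = residual - imf
        d = np.diff(residual)
        if np.all(d >= 0) or np.all(d <= 0):  # residual monotonic: stop
            break
    return imfs, residual

def imf_energy(imf):
    """Energy of an IMF; the workflow monitors this for the final IMF."""
    return float(np.sum(imf ** 2))
```

In the incremental-depth implementation, `emd` would be re-run on each new WITSML window of ROP values and the trend of `imf_energy(imfs[-1])` tracked over depth as the wear indicator.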
The two-stage condition monitoring approach takes account of the fact that, with the advent of improved lubricant formulations and more efficient filter designs, conventional laboratory oil testing has become less useful for extracting timely condition monitoring information. The new approach uses the latest advances in imaging hardware combined with ASTM D7684-11 compliant wear debris particle analysis software containing an extensively researched knowledge base that captures the diagnostic skills and experience of a number of expert wear debris analysts, each with decades of hands-on experience. It offers a cost-effective, on-site diagnostic capability to rival that of most specialist labs.

First-stage testing uses the latest computer vision technology to visualise fine debris. This digital imaging hardware enables on-site maintenance professionals not only to reliably size and count but also to analyse wear debris particles as small as 5 microns, offering timely equipment health information that few laboratories can match. The second stage consists of on-site, in-service filter analysis triggered by the appearance of abnormalities in the fine debris particles during first-stage analyses. The diagnostic wear debris particle analysis software then applies the five-level severity rating advocated by the ASTM D7684-11 standard guide, so that timely alerts prevent wear from escalating to a critical level.

The paper includes the results of a forensic case study illustrating how a catastrophic bearing failure, costing millions of euros in critical equipment down-time, could easily have been avoided had the two-stage condition monitoring methodology been applied. This new approach has the potential to avoid costly, unscheduled equipment down-time due to the unpredicted failure of critical equipment, as well as equally costly false alarms when equipment is unnecessarily removed from service.
This is accomplished by extracting information concerning equipment health from fine wear debris at an early stage in the wear process, where such information has previously only been available by the analysis of large wear debris particles at a much later stage in the equipment wear cycle.
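The stage-two trigger logic can be sketched as follows. The particle-size breakpoints and baseline factor are purely illustrative placeholders invented for this sketch, not the calibrated values behind the ASTM D7684-11 five-level scale.

```python
# Hypothetical max-particle-size breakpoints (microns) separating the
# five severity levels; real systems use a researched knowledge base.
SEVERITY_BREAKPOINTS_UM = [15, 25, 50, 100]

def severity(max_particle_um):
    """Map the largest detected wear particle (microns) to a 1-5 level."""
    level = 1
    for breakpoint in SEVERITY_BREAKPOINTS_UM:
        if max_particle_um > breakpoint:
            level += 1
    return level

def stage_two_required(max_particle_um, fine_count, baseline_count,
                       factor=3.0):
    """Trigger in-service filter analysis (stage 2) when the fine-debris
    picture is abnormal: severity reaches mid-scale, or the fine-particle
    count is well above the machine's established baseline (`factor` is
    a hypothetical tuning parameter)."""
    return severity(max_particle_um) >= 3 or fine_count > factor * baseline_count
```

The point of the two-stage design is visible in the code: routine fine-debris imaging is cheap and runs continuously, while the costlier filter analysis is reserved for the cases the first stage flags.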
Matsumoto, Keishi (Nippon Steel and Sumitomo Metal Corporation) | Sagara, Masayuki (Nippon Steel and Sumitomo Metal Corporation) | Miyajima, Makoto (Nippon Steel and Sumitomo Metal Corporation) | Kitamura, Kazuyuki (Nippon Steel and Sumitomo Metal Corporation) | Amaya, Hisashi (Nippon Steel and Sumitomo Metal Corporation)
Oil country tubular goods (OCTG) casing and liner wear is a critical problem in today's drilling environments. To put practical countermeasures in place, it is important to understand its mechanism. This paper presents tribological and electrochemical experiments using various OCTG casing materials and environmental liquids, along with in-situ observation and analysis of the rubbing interface. The results revealed that corrosion-resistant alloys (CRAs) showed an adhesive wear mechanism with relatively high wear rates, whereas low-alloy steels showed an abrasive or corrosive wear mechanism with mild wear rates. The wear rate clearly correlated with corrosiveness: the wear rate increased as corrosion current densities decreased. In-situ observation showed that the corrosion products γ-FeOOH and Fe3O4 were generated and simultaneously scraped off by sliding in the case of carbon steel, whereas no corrosion products were generated in the case of corrosion-resistant alloys. In conclusion, CRAs tend to undergo metal-to-metal adhesion (scuffing) with iron-based tool material, resulting in high wear rates, whereas low-alloy-steel casing can avoid adhesion through oxidation of its surface, resulting in mild wear rates.
Utilization of polycrystalline diamond compact (PDC) bits has led to improved drilling efficiency and bit performance, especially in long laterals in unconventional organic tight rocks. However, although drilling efficiency has improved, there are challenges related to increased bit wear in certain rock types. It is important to identify when bit wear starts occurring and in what rock type. No definitive methodology has been proven to date that helps operators understand precisely when a drill bit starts to wear out, the point at which drilling ahead becomes invisible NPT and carries a high risk of losing tools in hole. Drill bit burn and drill bit metamorphism have been subjects of discussion for several decades without conclusive determination, even though experiments over the years have demonstrated a strong connection to this phenomenon. The intense heat generated by drill bit "fatigue" during drilling, in conjunction with drilling fluid, hydrocarbons, and non-hydrocarbon gases, is recognized to generate artifacts that are usually not present naturally. Drill bit metamorphism denotes the artifacts produced by the interaction between the drill bit, rocks, and drilling fluid, and is related to both gas and rock artifacts. Part of the mechanical energy applied to the bit is transformed into heat by friction; this additional heat thermally cracks oil-based drilling fluids and produces both hydrocarbon and rock artifacts. This thermal cracking of hydrocarbons in the drilling fluids generates unsaturated hydrocarbons (alkenes) such as ethylene and propylene.
Real-time elemental analysis by X-ray fluorescence (XRF) on drill cuttings can help identify the rocks that cause drill bit metamorphism. Cuttings collected while drilling can be quickly measured with XRF, and the data can be relayed to the geo-steering team to help identify lithological heterogeneity.
In this case study, we show how the real-time application of alkene detection to identify drill bit metamorphism, combined with XRF elemental analysis to identify abrasive silica-rich layers, can be used to aid geo-steering and provide early alerts that prevent drill bit wear and, if necessary, to trip out of hole with the BHA to avoid severe damage to the bit and junk in hole. This real-time technology is of primary importance for optimizing drilling operations by performing timely bit trips, saving rig time, and significantly reducing invisible NPT.
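A sketch of how the two real-time signals might be combined into a single advisory. The ratio and concentration thresholds are hypothetical placeholders, not field-calibrated numbers from the case study.

```python
def bit_metamorphism_alert(c2h4_ppm, c2h6_ppm, sio2_wt_pct,
                           alkene_ratio_th=0.5, silica_th=70.0):
    """Combine the two signatures described in the text into one advisory:
    - alkene signature: thermal cracking of oil-based mud yields ethylene,
      so a high ethylene/ethane ratio suggests drill bit metamorphism;
    - XRF signature: high SiO2 marks the abrasive, quartz-rich layers.
    Both thresholds are illustrative assumptions."""
    ratio = c2h4_ppm / c2h6_ppm if c2h6_ppm > 0 else float("inf")
    alkene_flag = ratio > alkene_ratio_th
    silica_flag = sio2_wt_pct > silica_th
    if alkene_flag and silica_flag:
        return "trip-out advisory: probable bit wear in abrasive silica-rich layer"
    if alkene_flag or silica_flag:
        return "early warning: monitor alkenes and XRF closely"
    return "normal"
```

Tying the advisory to both signatures, rather than either alone, reflects the case study's logic: silica identifies where wear is likely, while the alkenes confirm that destructive heating is actually occurring at the bit.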
ABSTRACT: The cutter, the fundamental cutting element of the PDC (polycrystalline diamond compact) bit, directly determines the overall performance of the bit and thus influences rate of penetration (ROP), drilling efficiency, and drilling cost. Bits scrapped because of cutter failure account for about 90% of all failures in field operations. Much research has been conducted on PDC cutter design, mainly concentrating on the structure itself, such as stress state, self-sharpening, and interface bonding strength, while neglecting the closely related factor of the rock-breaking method. This paper presents a novel PDC cutter designed on the basis of bionics theory to realize a layered-spalling concept of rock breaking. The paper focuses on numerical simulation of the dynamic rock-breaking process under the new bionic PDC cutter to illustrate the rock-breaking mechanism, and corresponding single-cutter experiments were performed to validate the theory. The results indicate that: (a) the rock-breaking process under the new bionic PDC cutter is characterized by “layered spalling” with slight oscillation, in contrast to the “block dropping” of a conventional cutter; and (b) the cutting process of the bionic cutter is more stable, with minor impact vibration, which helps extend the cutter's life. The amplitude of the cutting force is reduced by 42%: the average for the new bionic PDC cutter is 460 N versus 790 N for the conventional PDC cutter.
The PDC bit is the most important tool in oil drilling. A large number of field tests and laboratory studies (Kaitkay and Lei 2005, Gouda et al. 2011) have shown that the cutting teeth are prone to failure, such as chipping and breakage of the cutter and peeling and wear of the diamond layer (Lin et al. 1992), so the quality of the cutter directly affects the performance and life of the drill bit (Yahiaoui et al. 2013). Many studies have focused on the comprehensive performance of the PDC cutter (Martinez et al. 2013). Most of these studies sought to ameliorate the stress condition of the PDC cutter, improve the cutters' self-sharpening performance, and enhance the interface bonding strength (Tammineni et al. 2013, Azar et al. 2013, Raghav et al. 2013). Although many new design methods have been proposed and good results obtained (Zhang et al. 2013), the problem persists: the designs pay attention only to the cutter itself, ignoring the impact of rock-breaking modes on cutter performance (Gerbaud 2011).