The term "safety" has become widespread, but since it is a big idea, it may result hard to grasp. In managing safety, the focus is often on accidents and incidents, however, serious events are infrequent and with them the occasions to learn. Learning focuses on everyday events can take place continuously rather than being a reaction to a single serious event. The aim has been to raise awareness of all our people and to provide them with tangible measurements of safety in absence of incidents. In this work it has been detailed a method to quantify monetary returns associated to the continuous identification, analysis and evaluation of risk and relevant controls, based on the fact that management is traditionally focused on financial goals and therefore more inclined to understand such sort of a language. Thus, it has been considered what a return on investment looks like in safety for transport operations. Based on evidences collected from our Business Units, a business case was developed where a disruption occurred in the operating activities due to some issues (i.e.
In this study, we aim to demonstrate how machine learning can empower computational models that predict the flow rate of a given well. Given current real-time data and periodic well tests, this new method computes flow rates using a data-driven model. The computational model is based on analyzing the relations and trends in historical data. Relational databases contain huge amounts of data accumulated over decades. In addition, a large number of operational data points arrive every second, giving considerable insight into the current status, performance, and health of many wells. The project aims to utilize this data to predict the flow rate of a given well.
A variety of well attributes serve as inputs to the computational models that find the current flow rate. Artificial Neural Networks (ANN) were used to build these computational models. In addition, a grid search algorithm was used to fine-tune the ANN parameters for every single well. Building a single, unique model for every well yielded the most accurate results. Data-rich wells performed better than wells with insufficient data. To further enhance the accuracy of the models, they are retrained after every incoming batch of real-time data. This retraining calibrates the models to constantly represent the true well performance and predict better. In practice, flow rate prediction is used by production engineers to analyze the performance of a given well and to accelerate the process of well test verification. One of the main challenges in building unique models for every well is fine-tuning the parameters of the artificial neural networks, which can be a computationally intensive task. Parameter fine-tuning has not been discussed in previous literature on flow rate prediction. Our approach therefore addresses the individuality of every well and builds models accordingly. This high level of customization addresses the problem of under-fitting in ANN well models.
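The per-well grid-search workflow described above can be sketched as follows. This is a minimal illustration assuming scikit-learn; the feature names, grid values, and data are illustrative placeholders, not the study's actual dataset or configuration.

```python
# Hypothetical sketch of per-well flow-rate modelling with a grid-searched ANN.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for one well's real-time attributes (e.g. wellhead
# pressure, temperature, choke size) and its well-test flow rates.
X = rng.normal(size=(200, 3))
y = 100 + 30 * X[:, 0] - 10 * X[:, 1] + rng.normal(scale=2, size=200)

# One model per well: grid-search the ANN hyperparameters for this well only.
pipe = make_pipeline(StandardScaler(),
                     MLPRegressor(max_iter=2000, random_state=0))
grid = GridSearchCV(
    pipe,
    param_grid={"mlpregressor__hidden_layer_sizes": [(16,), (32,), (32, 16)],
                "mlpregressor__alpha": [1e-4, 1e-3, 1e-2]},
    cv=3, scoring="neg_mean_squared_error")
grid.fit(X, y)
print(grid.best_params_)

# Retraining on each new batch of real-time data simply refits the best model;
# in production, X and y would include the newest batch.
best_model = grid.best_estimator_
best_model.fit(X, y)
```

In this setup, the grid search runs once per well, so the tuned architecture reflects that well's individual behavior, which is the customization the text describes.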
The petroleum industry has been undergoing transitional changes mandated by the high competition in the business. Decision making and engineering approaches have evolved from trial and error and educated guesses, through well-studied designs, to advanced, well-developed models and simulations. Yet the development of approaches has been, and will always be, rapidly ongoing. To stay competitive, the approaches must constantly improve.
This study was performed as a proof of concept to assess and demonstrate the practicality of using supervised machine learning (ML) techniques to predict multiphase flow regimes in horizontal pipes. The flow comprises air, water, and oil. The input features were water cut (the percentage of water), gas superficial velocity, and liquid superficial velocity. The predicted output was one of six possible flow regimes. The algorithms assessed in the study were Decision Tree, Random Forest, Logistic Regression, Support Vector Machine (SVM), and Neural Network Multi-Layer Perceptron (MLP). According to the results, the random forest algorithm is the best candidate for this dataset, with a high accuracy of 90.8% and a low training time (0.13 seconds), and it remains the best choice if the size of the data and the number of features are increased.
All of the predictive algorithms can be improved in accuracy by increasing experience, either by enlarging the dataset or by adding relevant features. The trained model and logic can be applied in industry by automating flow control or by installing prediction and mitigation systems in pipelines and field operations.
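The classification setup above can be sketched as follows, assuming scikit-learn. The regime labels here come from a crude synthetic rule standing in for the experimental regime map, so the code illustrates the workflow rather than reproducing the study's 90.8% result.

```python
# Illustrative sketch of flow-regime classification with a random forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)

# Three input features: water cut, gas superficial velocity, liquid
# superficial velocity; the label is one of six flow regimes (0..5).
X = rng.uniform(low=[0.0, 0.1, 0.01], high=[1.0, 30.0, 3.0], size=(600, 3))
# Synthetic labelling rule (thresholds on gas superficial velocity) standing
# in for the experimentally observed regime boundaries.
y = np.digitize(X[:, 1], [1, 3, 8, 15, 22]).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
clf = RandomForestClassifier(n_estimators=100, random_state=1)
clf.fit(X_tr, y_tr)
print(f"accuracy: {accuracy_score(y_te, clf.predict(X_te)):.3f}")
```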
Core & Log Neural Network Modeling (CLONNE) was initiated to use an ANN to make optimal use of available data and generate synthetic log and core data, enabling users to eliminate special log and core data acquisition in the future. This will reduce well cost and the time required for data acquisition and data analysis.
The CLONNE process starts with gathering the available core and log data, which are then quality-controlled, conditioned for bad hole, light hydrocarbon, and thin lamination, and normalized. Pairs of core and log data are then combined as a dummy well to generate the first CLONNE model, which can be used for prediction across whole fields. Conventional data, including density, neutron, sonic, and GR logs along with other parameters, are used to generate the output. A random well from the field is selected to test how well the CLONNE predictions match the real acquired data. Several calibrations are performed to provide the best predictability.
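The train-then-blind-test loop described above can be sketched in a few lines. This is a minimal illustration assuming scikit-learn; the log curves and the target (core porosity) are synthetic placeholders, not CLONNE's actual training data or architecture.

```python
# Minimal sketch of the CLONNE idea: train an ANN on the combined "dummy
# well" to map conventional logs to a core property, then predict a
# synthetic curve on a blind well for comparison with real measurements.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)

# Dummy well: conventional inputs (density, neutron, sonic, GR) paired
# depth-by-depth with a measured target (here, core porosity).
logs = rng.normal(size=(500, 4))
core_phi = 0.25 - 0.05 * logs[:, 0] + 0.03 * logs[:, 1] \
           + rng.normal(scale=0.01, size=500)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 16),
                                   max_iter=3000, random_state=2))
model.fit(logs, core_phi)

# Blind-well check: predict the synthetic curve where only conventional
# logs exist, for later comparison against newly acquired measurements.
blind_logs = rng.normal(size=(50, 4))
synthetic_phi = model.predict(blind_logs)
print(synthetic_phi[:3])
```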
Currently, a number of CLONNE models have been created for offshore fields in Malaysia. For CLONNE synthetic logs, four models have been created to predict porosity, bulk density, neutron, and shear slowness. For CLONNE synthetic core, three models have been created to predict grain size, permeability, and porosity. All of these models have predicted quite well in both thick sand and laminated sand. More models will follow to predict other log curves and core parameters. The established models have been tested in one field, where a synthetic sonic log was created. After drilling and the subsequent logging run, an actual sonic log was acquired and compared, yielding 96% agreement. The data predicted from CLONNE can save almost 15 months otherwise spent acquiring and analyzing core data, and also almost RM 6 million in total expenditure for core data acquisition and analysis.
In 2018, CLONNE achieved RM 6 million in cost avoidance from application in three fields in Malaysia. The CLONNE models generated can be applied to basin-wide prediction, enabling shared use of data. This will help integrate the available data instead of the data being utilized only in a specific field.
Noei, Emad Ghaleh (Dept. of Geomatics Engineering University of Calgary, Canada) | Dettmer, Jan (Dept. of Geoscience, University of Calgary, Canada) | Ali, Mohammed (Dept. of Earth Sciences, Khalifa University, UAE) | Lee, Gyoo Ho (Korea Gas Corporation, Korea) | Kim, Jeong Woo (Dept. of Geomatics Engineering University of Calgary, Canada)
This work investigates nonlinear inversion of gravity data to infer Infracambrian Hormuz salt structures offshore Abu Dhabi, UAE. A Bayesian approach with a trans-dimensional parametrization of the subsurface is applied that does not require regularization, resulting in more objective inversion results. The trans-dimensional parametrization discretizes the subsurface structure, including the salt dome, by an irregular grid of Voronoi cells. Both the number of cells and the cell coordinates are unknown parameters estimated from the gravity data. The density contrast of the salt structures is assumed known. The solution in Bayesian inversion is given by a large ensemble of parameter sets. Here, the trans-dimensional ensemble is obtained with the reversible-jump Markov chain Monte Carlo (rjMCMC) algorithm. Residual errors are parametrized by a full covariance matrix, which is estimated and updated as part of an iterative inversion scheme. Efficient rjMCMC sampling is achieved with parallel tempering. Inversion of airborne gravity anomalies illustrates well-defined Infracambrian Hormuz salt structures offshore Abu Dhabi, where the irregular grid spatially adapts to the data information without the need to impose explicit regularization or fixed grids. Uncertainty estimates highlight the salt dome extent. This study provides new insight into the existence and shape of oil reservoirs associated with the underlying salt structures.
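The birth/death mechanics of trans-dimensional sampling can be illustrated with a heavily simplified one-dimensional toy. This is not the paper's implementation: there is no parallel tempering, no hierarchical covariance estimation, and the acceptance rule uses the standard simplification that, with a uniform prior on dimension and birth positions proposed from the prior, the prior and proposal terms cancel and acceptance reduces to the likelihood ratio.

```python
# Toy trans-dimensional (birth/death/move) sampler for point-mass gravity
# anomalies along a line; all numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)

xs = np.linspace(0.0, 10.0, 40)   # gravity station positions
DEPTH, KMAX = 1.0, 10             # fixed source depth, max number of nuclei

def forward(nuclei):
    """Gravity anomaly from point sources of fixed density contrast."""
    if len(nuclei) == 0:
        return np.zeros_like(xs)
    d = xs[:, None] - np.asarray(nuclei)[None, :]
    return (DEPTH / (d**2 + DEPTH**2)).sum(axis=1)

# Synthetic data: two true anomalies plus Gaussian noise.
true_nuclei, sigma = [3.0, 7.0], 0.05
data = forward(true_nuclei) + rng.normal(scale=sigma, size=xs.size)

def log_like(nuclei):
    r = data - forward(nuclei)
    return -0.5 * np.sum((r / sigma) ** 2)

nuclei = [5.0]                    # deliberately poor starting model
ll = log_like(nuclei)
samples = []
for it in range(5000):
    u, cand = rng.random(), list(nuclei)
    if u < 1 / 3 and cand:                      # move one nucleus
        j = rng.integers(len(cand))
        cand[j] = np.clip(cand[j] + rng.normal(scale=0.3), 0.0, 10.0)
    elif u < 2 / 3 and len(cand) < KMAX:        # birth from the prior
        cand.append(rng.uniform(0.0, 10.0))
    elif len(cand) > 1:                         # death of a random nucleus
        cand.pop(rng.integers(len(cand)))
    ll_cand = log_like(cand)
    if np.log(rng.random()) < ll_cand - ll:     # simplified acceptance
        nuclei, ll = cand, ll_cand
    samples.append(len(nuclei))

print("posterior mean number of nuclei:", np.mean(samples[1000:]))
```

The key trans-dimensional feature is that the number of sources is itself sampled: the ensemble of `samples` carries the posterior on model dimension, analogous to how the paper's Voronoi-cell count adapts to the data information.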
Perforation is the commonly used well completion technology for establishing communication between the wellbore and hydrocarbon-bearing formations, which is critical for optimal production. The proposed new well completion perforation technology utilizes high-power laser energy. The technology has attracted the oil and gas industry due to its unique properties: it is a non-explosive perforation technology with precise control over the perforated tunnel's shape, size, and geometry, and the laser energy also creates permeability in the perforated tunnel. The successful research effort has led to a field deployment strategy over the next two years. The laser completion system consists of the laser source that generates the energy, which is mounted on a coiled tubing unit at the surface; a fiber-optic cable that transmits the energy to the downhole target; and the laser perforation tool. The principle of the tool is based on combining optical and mechanical components: the alignment of the optics inside the tool provides the ability to control and generate different beam shapes and sizes, which create the several perforation shots in any direction.
A comparison was made between shaped-charge perforation and laser perforation under the same conditions and on the same rock type. Pre- and post-perforation measurements were conducted for comparison. The tunnel perforated by the high-power laser is clean, with no compaction, deformation, or melt; the result is the same in all rock types, with improvement in permeability along the perforated tunnels. Detailed analysis was carried out using different methods such as imaging and thermal analysis.
High-power lasers provide a controllable heat source while penetrating the formation, which enhances flow properties, especially in tight formations. Continued efforts over the past two decades have proven that laser perforation can create large perforation tunnels with a large diameter. Laser perforation has been applied to all types of rocks, including unconventional tight sands.
The potential of the technology is that lasers can be the next generation of smart perforation, changing current well perforation practice. State-of-the-art high-power laser technology in downhole perforation provides an innovative and safe non-explosive alternative. Precision in controlling the power, orientation, and shape of the perforation beam are among the properties of the technology that make it attractive for downhole applications.
The Spraberry trend area is part of a larger oil-producing region within the Midland Basin in the United States. The main targets, Spraberry, Dean, and Wolfcamp, are reservoirs of shales interbedded with clastic formations. Thus, the reservoirs exhibit TIV (transverse isotropy with a vertical axis of symmetry) anisotropy due to thin laminations. A pilot well was drilled vertically in the complex lithology and logged with advanced acoustic measurements. Shallow penetration of Stoneley energy into the formation raised concerns about the depth resolution of the shear slowness inverted from it. It is very difficult to obtain a reliable horizontal shear slowness from the Stoneley wave when the borehole is rugose, the mud rheology is complex, and there is gas influx inside the borehole.
A machine learning based approach integrating advanced acoustic measurements and petrophysical interpretation is adopted to provide a lithology-based horizontal shear slowness. To eliminate the variability of deriving the horizontal shear slowness from the Stoneley wave before processing an advanced geomechanics product such as TIV anisotropy analysis, two machine learning algorithms are used. The first is multi-linear regression (MLR), a very commonly used linear supervised learning algorithm, and the second is random forest (RF), a nonlinear supervised learning algorithm. These algorithms take inputs from formation evaluation and advanced acoustics to predict the horizontal shear slowness. The random forest algorithm, being an ensemble learning method, has greater predictive capability than any linear supervised learning model and many of the nonlinear supervised learning algorithms. The inputs for the RF and MLR regressions are the dry weight fractions of calcite, dolomite, quartz, and illite, together with total porosity, permeability, gamma ray, compressional slowness, and fast shear slowness. These values are obtained over the entire depth of interest from advanced logging tools and interpretation techniques. To check the performance of the model, standard machine learning error evaluation metrics such as the mean squared error and the coefficient of determination (R2) are used.
Hyperparameter tuning of the RF model has been performed to improve the prediction accuracy. After the parameters are tuned, the mean squared error and the coefficient of determination improve accordingly.
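The tuned-RF workflow above can be sketched as follows, assuming scikit-learn. The nine inputs mimic the listed feature set, but the data and the tuning grid are synthetic placeholders rather than the Spraberry logs or the study's actual search space.

```python
# Hedged sketch: random forest regression of horizontal shear slowness with
# hyperparameter tuning and the MSE / R2 evaluation metrics named in the text.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(4)

# Nine inputs: calcite, dolomite, quartz, illite dry weights, total porosity,
# permeability, gamma ray, compressional slowness, fast shear slowness.
X = rng.uniform(size=(400, 9))
# Synthetic horizontal shear slowness as a nonlinear mix of the inputs.
y = 200 + 80 * X[:, 8] + 40 * X[:, 3] * X[:, 4] + rng.normal(scale=2, size=400)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=4)

# Hyperparameter tuning of the random forest via cross-validated grid search.
grid = GridSearchCV(RandomForestRegressor(random_state=4),
                    param_grid={"n_estimators": [100, 300],
                                "max_depth": [None, 8],
                                "min_samples_leaf": [1, 3]},
                    cv=3, scoring="neg_mean_squared_error")
grid.fit(X_tr, y_tr)
pred = grid.predict(X_te)
print(f"MSE = {mean_squared_error(y_te, pred):.2f}, "
      f"R2 = {r2_score(y_te, pred):.3f}")
```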
Shaik, Abdul Ravoof (ADNOC, UAE) | AlAmeri, Waleed (Khalifa University, UAE) | AlSumaiti, Ali (ADNOC, UAE) | Muhammad, Mushtaq (Khalifa University, UAE) | Thomas, Nithin Chacko (Khalifa University, UAE)
Polymer flooding is one of the most successful EOR methods, and polymers are extensively used in EOR applications. A large number of factors affect the viscosity and rheological properties of polymers. These include, but are not limited to, the effects of salinity, temperature, and shear rate on the viscosity of polymer solutions, and an exhaustive evaluation is generally required to assess the effects of these parameters on polymer rheology under various conditions. However, the extensive manpower required to run such experiments can hinder their applicability. Previously, several efforts have been made to develop mathematical relationships for polymer rheology, but these are limited to only a few parameters. This has prompted a pressing need for new physics-based, data-driven concepts and tools that can embrace the patterns embodied in these rich polymer bulk rheology data sets. In this study, an extreme gradient boosting based machine learning model is developed to predict the responses of polymer rheology to changes in monovalent and divalent ions, providing an alternative approach.
In this work, an extensive series of polymer rheology experiments was performed, and the effects of concentration, temperature, shear rate, and salinity on polymer viscosity were observed. The experimental data were divided randomly into two different sets. The first set was used to develop a machine learning model based on a scalable end-to-end tree boosting system. The efficiency of the machine learning model in predicting polymer rheology was validated with the second set of experimental data. The machine learning model was later used to investigate the effects of individual ions, temperature, and brine hardness on polymer rheology.
The accuracy of the machine learning model is measured using the root-mean-square error (RMSE), relative error (RE), and coefficient of determination (R2). A blind test comparing predicted results with the second set of experimental data has shown that the developed predictive model is capable of predicting polymer rheology with a maximum error of 2%, which is within the range of experimental artifacts. Based on the present work, it has been observed that brine hardness, with divalent cations such as Ca2+ and Mg2+, can significantly affect polymer rheology. This observation can be attributed to charge shielding effects. HPAM-based polymers and other anionic polymers carry negative charges on their backbone. The long polymer chains stretch out in solution due to the mutual repulsion of these charges, and this extended conformation gives polymers their viscosity in brine. However, in the presence of divalent cations, the chains curl up into ball-shaped conformations due to the redistribution of positive charges, and hence the polymer loses its viscosity in solution. This phenomenon is termed the charge shielding effect. In addition, the machine learning model is used to study brine hardness at varying temperature, concentration, salinity, and shear rate. Finally, the machine learning model is used to predict the required polymer concentration for given high-temperature, high-salinity reservoirs. Application of the proposed model will provide a clear pathway to screen polymers over a wide range of reservoir conditions in order to find the most suitable polymer to maximize oil recovery.
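The boosting-and-metrics workflow above can be sketched as follows. This sketch uses scikit-learn's GradientBoostingRegressor as a stand-in for the paper's extreme-gradient-boosting system, and the "viscosity" here is a synthetic function chosen only to reproduce the qualitative trends the text describes (viscosity falling with temperature, shear rate, and divalent-ion content); the feature ranges are illustrative assumptions.

```python
# Illustrative sketch of tree-boosted polymer-viscosity prediction with the
# RMSE / RE / R2 metrics named in the text; data are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(5)

# Features: polymer concentration (ppm), temperature (C), shear rate (1/s),
# NaCl (monovalent) and CaCl2 (divalent) concentrations (ppm).
X = rng.uniform(low=[500, 25, 1, 0, 0],
                high=[3000, 90, 1000, 2e5, 2e4], size=(800, 5))
# Synthetic viscosity: rises with concentration, falls with temperature,
# shear rate, and especially divalent-ion content (charge shielding).
y = (X[:, 0] / 500) ** 1.5 * np.exp(-X[:, 1] / 200) \
    * (1 + X[:, 2]) ** -0.3 * np.exp(-X[:, 4] / 1e4) * 50

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=5)
model = GradientBoostingRegressor(n_estimators=300, max_depth=3, random_state=5)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

rmse = mean_squared_error(y_te, pred) ** 0.5
re = np.mean(np.abs(pred - y_te) / y_te) * 100      # relative error, %
print(f"RMSE = {rmse:.3f}, RE = {re:.2f}%, R2 = {r2_score(y_te, pred):.3f}")
```

Once trained, the same model can be queried at fixed temperature and salinity to scan concentrations, which is the screening use the text describes.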
Khan, Riaz (ADNOC) | Al Hanaee, Ahmed (ADNOC) | Al Tameemi, Kate (ADNOC) | Kurniawan, Redy (ADNOC) | Omonigho, Neil (ADNOC) | Gueddoud, Abdelghani (ADNOC) | Abdelaal, Atef (ADNOC) | Vantala, Aurifullah (ADNOC)
The Gachsaran Formation across onshore Abu Dhabi, and possibly across the U.A.E., poses high potential for generating shallow biogenic gas (mainly methane) and as such has drawn attention for further investigation, understanding, and evaluation of its capability as a promising gas resource. The paper provides a detailed G&G analysis that has enabled an appropriate characterization of this unique formation and has, for the first time, uncovered interesting data responses for differentiating the sweet spots.
For the first time in the history of the U.A.E., new data was acquired specifically targeting the Miocene Gachsaran Formation. This includes 2D seismic and partial 3D seismic interpretations, thousands of feet of continuous core, conventional and advanced subsurface and surface logging, formation pressure data, fluid sampling, geochemical and geomechanical laboratory measurements, and stimulation and frac test data. The Gachsaran Formation is very challenging: its complex, thinly bedded, and intercalated lithological variations, together with its tightness, make it difficult to identify the promising areas of gas-bearing layers. A comprehensive analysis was performed, in the light of regional understanding, by integrating the results of all available data in the form of correlation, cluster analysis, cross-plotting, and well-based rock physics to differentiate the effect of gas within the formation. The potential zones were further tested, and the results were integrated to confirm the analysis.
The Gachsaran Formation has been subdivided into the Lower, Middle, and Upper Gachsaran Members. The Lower Member is predominantly evaporitic, becoming more argillaceous, carbonate- and shale-bearing in the Middle Member, with comparatively less anhydrite. The Upper Member contains mainly anhydrites with interbedded shales and carbonates. The potential sequences, which show high total organic carbon and gas shows, are found within the Middle Gachsaran.
Consequently, the Middle Gachsaran Member was analyzed based on the robust data acquisition performed. Several relationships among GR, TOC, gas shows, lithology, RHOB, NPHI, sonic, AI, Vp/Vs, gradient impedance, and XRD clay mineralogy were examined to check for a possible gas-existence effect on the data. This led to identifying the sweet spots caused by the existence of any dominant gas within the study area. The potential zones were confirmed by well testing. Furthermore, the data variables were distributed within a 3D grid, and based on the analysis performed, the areas of the sweet spots were identified. In the next phase of the study, the results will be integrated with the upcoming geophysical seismic inversion studies to further improve the possibility of identifying the sweet spots across the study area.
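The cluster-analysis step mentioned above can be sketched as follows. This is a hypothetical illustration assuming scikit-learn; the log curves are synthetic placeholders, and the injected "gas signature" (lower RHOB, NPHI, and Vp/Vs) is an assumption made only so the example has something to flag.

```python
# Hypothetical sketch of clustering log responses to flag a candidate
# gas-effect group within an interval; all curves are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)

n = 300  # depth samples across the interval of interest
logs = np.column_stack([
    rng.normal(60, 15, n),     # GR (gAPI)
    rng.normal(2.5, 0.1, n),   # RHOB (g/cc)
    rng.normal(0.2, 0.05, n),  # NPHI (v/v)
    rng.normal(80, 10, n),     # sonic (us/ft)
    rng.normal(1.8, 0.1, n),   # Vp/Vs
])
# Inject a synthetic gas-like signature into the first 60 samples.
logs[:60, 1] -= 0.15
logs[:60, 2] -= 0.08
logs[:60, 4] -= 0.25

km = KMeans(n_clusters=3, n_init=10, random_state=6)
labels = km.fit_predict(StandardScaler().fit_transform(logs))

# Flag the cluster with the lowest mean Vp/Vs as the candidate gas effect,
# to be cross-checked against gas shows, TOC, and well-test results.
gas_cluster = int(np.argmin(km.cluster_centers_[:, 4]))
print("candidate gas-effect samples:", int(np.sum(labels == gas_cluster)))
```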
This robust data acquisition targeting the Gachsaran was performed for the first time in the history of the U.A.E. The results are encouraging in establishing the relationships needed to identify the dominant gas effects within the area. The estimation of realistic gas in place and the confirmation of a commercial discovery will open a new era of shallow gas resources within the U.A.E.
A digital twin is an enhanced digital representation of a real system. Digital twins are able to mimic the operation of physical systems and use data captured from their sensors to detect abnormal conditions and diagnose the cause of problems. This paper discusses the digital twin concept in general and describes the process of developing a digital twin for electric submersible pump (ESP) systems. It includes a description of the different subsystems interacting with the ESP equipment, the physics governing the dynamics of each subsystem, and the implementation of their digital versions. The ESP digital twin considered in this paper encompasses the ESP-lifted well and includes digital versions of the ESP string, well completion, well fluids, and near-wellbore reservoir, among others. The document also describes the implementation of the physics models into a simulator capable of representing diverse operating conditions of the system, including failures and transients.
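One small physics element such a simulator might contain can be sketched as follows: a pump head curve scaled by the affinity laws (flow scales with speed, head with speed squared), intersected with a simple wellbore system curve to find the operating point. The curve coefficients are illustrative assumptions, not any vendor's catalog data or the paper's model.

```python
# Minimal sketch of an ESP operating-point calculation for a digital twin.
def pump_head(q, speed_ratio):
    """Catalog head curve at reference speed, scaled by the affinity laws:
    flow scales with N, head scales with N^2. Units: q in bpd, head in ft."""
    q_ref = q / speed_ratio                 # equivalent flow at reference speed
    h_ref = 1200.0 - 0.002 * q_ref**2       # assumed per-string head curve
    return h_ref * speed_ratio**2

def system_head(q):
    """Required head: static lift plus quadratic friction losses (assumed)."""
    return 600.0 + 0.0008 * q**2

def operating_point(speed_ratio, q_lo=0.0, q_hi=2000.0):
    """Bisect on pump_head - system_head to find the operating flow rate."""
    f = lambda q: pump_head(q, speed_ratio) - system_head(q)
    for _ in range(60):
        mid = 0.5 * (q_lo + q_hi)
        if f(q_lo) * f(mid) <= 0:
            q_hi = mid
        else:
            q_lo = mid
    return 0.5 * (q_lo + q_hi)

q_op = operating_point(1.0)                 # operating point at nominal speed
print(f"operating flow: {q_op:.0f} bpd, head: {system_head(q_op):.0f} ft")
```

A twin built from such elements can then be run at off-nominal speeds or with perturbed friction terms to mimic transients and failure conditions, and its output compared against live sensor data to flag abnormal behavior.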