Reservoirs that produce under active water drive pose significant uncertainty for the implementation of chemical EOR processes. This paper describes a successful pilot test of the ASP process in a clastic reservoir operating under strong aquifer drive. The field has ~30 years of production history. The objective of the pilot was to understand the response of the ASP process in a mature reservoir operating under active edge-water drive. The build-up permeability of the reservoir is 2-8 Darcy, with oil viscosity of ~50 cP. Key observations such as production performance, incremental oil gain, and polymer breakthrough are discussed in this paper following completion of the pilot.
On the basis of laboratory studies and simulation, the ASP pilot was implemented in the field in 2010. The pilot was designed as a single inverted five-spot pattern with one observation well. It envisaged injection of a 0.3 pore volume (PV) Alkali-Surfactant-Polymer (ASP) slug and a 0.3 PV graded polymer buffer, followed by 0.4 PV of chase water. The pilot was meticulously monitored for production performance and breakthrough of chemicals. All the pilot producers have more than 20 years of production history. The base oil rate and water cut were fixed before the start of the pilot on the basis of test data, and these were used to monitor pilot performance. An Interwell Tracer Test (IWTT) was conducted before the start of ASP injection to understand sweep in the pilot area. In addition, the quality of the injection water and the chemical concentrations in the ASP slug were checked regularly to ensure quality.
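The dimensionless slug design above translates into injected volumes once the pattern pore volume is known. A minimal sketch, assuming a purely hypothetical pattern pore volume (the abstract does not give one):

```python
# Hedged sketch: converting the pilot's dimensionless slug design into
# injection volumes. The pattern pore volume below is hypothetical;
# the abstract does not state it.
pattern_pore_volume_m3 = 100_000  # assumed pattern PV, illustration only

slug_design = {
    "ASP slug": 0.30,        # 0.3 PV Alkali-Surfactant-Polymer slug
    "polymer buffer": 0.30,  # 0.3 PV graded polymer buffer
    "chase water": 0.40,     # 0.4 PV chase water
}

slug_volumes_m3 = {name: frac * pattern_pore_volume_m3
                   for name, frac in slug_design.items()}
total_pv_injected = sum(slug_design.values())  # full design sums to 1.0 PV
```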
A significant response was observed within 15 months of the start of the pilot, as published in 2012. This paper describes the learnings and conclusions after successful completion of the pilot. A ~40-50% jump in oil rate was observed during the ASP injection period, which was sustained for 12-18 months. However, preferential breakthrough of the ASP slug in one of the producers impacted the incremental oil gain. The preferential breakthrough of polymer was due to the presence of high-permeability streaks, which was rectified by a profile modification job. In addition, strong aquifer movement was experienced during ASP injection, which led to a rise in the water cut of a pilot well; the well was restored through water shutoff jobs. After completion of the ASP slug and mobility buffer, a cumulative incremental oil gain of ~28,000 m3 was obtained, in line with the simulation studies' prediction. A 12-14% decrease in water cut was observed, sustained for ~6-18 months. Regular monitoring of produced fluid indicated breakthrough of polymer and alkali in 2-3 producers. During the pilot, produced-fluid handling issues such as tough emulsion formation and lift malfunctioning were not observed. These collective observations indicate the success of the ASP pilot project.
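Incremental oil of the kind reported above is typically tallied as the cumulative difference between observed rates and the pre-pilot baseline. A minimal sketch with purely illustrative numbers, not the field's data:

```python
# Hedged sketch of how incremental oil from a pilot could be tallied:
# cumulative difference between observed and pre-pilot baseline rates.
# All numbers below are illustrative, not the field's actual data.
base_oil_rate_m3_per_day = 50.0  # hypothetical pre-pilot baseline

# hypothetical monthly-average observed rates (m3/day) after ASP start,
# mimicking the ~40-50% jump described in the abstract
observed_rates = [52, 60, 70, 72, 71, 68, 65, 60]
days_per_month = 30.4

incremental_oil_m3 = sum(
    (q - base_oil_rate_m3_per_day) * days_per_month for q in observed_rates
)
```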
Very few case histories of successful ASP pilot implementation are available for reservoirs operating under active aquifer drive. The learnings from this ASP project can be taken forward for expansion of the ASP flood and for designing ASP pilot/commercial projects in analogous reservoirs.
Temizel, Cenk (Aera Energy) | Balaji, Karthik (University of North Dakota) | Canbaz, Celal Hakan (Ege University) | Palabiyik, Yildiray (Istanbul Technical University) | Moreno, Raul (Smart Recovery) | Rabiei, Minou (University of North Dakota) | Zhou, Zifu (University of North Dakota) | Ranjith, Rahul (Far Technologies)
Due to the complex characteristics of shale reservoirs, data-driven techniques offer fast and practical solutions for the optimization and better management of shale assets. Developments in data-driven techniques enable robust analysis not only of the primary depletion mechanisms but also of enhanced oil recovery in unconventionals, such as natural gas injection. This study provides a comprehensive background on the application of data-driven methods in the oil and gas industry, the process, methodology, and learnings, along with examples of data-driven analysis of natural gas injection in shale oil reservoirs through the use of publicly available data.
The data are obtained and organized. Patterns in production data are analyzed using data-driven methods to understand the key parameters in the recovery process as well as the optimum operational strategies to improve recovery. The complete process is illustrated step by step for clarity and to serve as a practical guide for readers. This study also describes what alternative physics-based evaluation methods can offer under current conditions of data availability and understanding of recovery physics in shale oil assets, and compares their outcomes with those of the data-driven methods. Thereby, a thorough comparison of physics-based and data-driven methods, along with their advantages, drawbacks, and challenges, is provided.
It has been observed that data organization and filtering take significant time before application of the actual data-driven method; nonetheless, data-driven methods serve as a practical solution in fields that are mature enough to yield data for analysis, as long as the methodology is carefully applied. The advantages, challenges, and associated risks of using data-driven methods are also included. The comparison between physics-based and data-driven methods illustrates the advantages and disadvantages of each, highlights the differences in evaluation and outcome, and provides a guideline for which strategy and evaluation to use in an asset.
A comprehensive understanding of the interactions between key components of the formation, and of the way various elements of an EOR process impact these interactions, is of paramount importance. The few existing studies on natural gas injection in shale oil that use data-driven methods include comparisons with physics-based methods, but they lack an examination of the interrelationship between physics-based and data-driven methods as both complements and competitors amid the rise of unconventionals. This study closes that gap and serves as an up-to-date reference for industry professionals.
Alkinani, Husam H. (Missouri University of Science and Technology) | Al-Hameedi, Abo Taleb T. (Missouri University of Science and Technology) | Dunn-Norman, Shari (Missouri University of Science and Technology) | Alkhamis, Mohammed M. (Missouri University of Science and Technology) | Mutar, Rusul A. (Ministry of Communications and Technology)
Lost circulation is a complicated problem to predict with conventional statistical tools. As the drilling environment becomes more complicated, more advanced techniques such as artificial neural networks (ANNs) are required to help estimate mud losses prior to drilling. The aim of this work is to estimate mud losses in induced-fracture formations prior to drilling, to assist drilling personnel in preparing remedies for this problem before entering the loss zone. Once the severity of losses is known, the key drilling parameters can be adjusted to avoid, or at least mitigate, losses as a proactive approach.
Lost circulation data were extracted from over 1,500 wells drilled worldwide. The data were divided into three sets: training, validation, and testing. 60% of the data were used for training, 20% for validation, and 20% for testing. An ANN consists of the following layers: the input layer, hidden layer(s), and the output layer. Determining the optimum number of hidden layers and the number of neurons in each hidden layer is required for the best estimation; this was done using the mean squared error (MSE). A supervised ANN was created for induced-fracture formations, with one hidden layer of ten neurons. Since there are many training algorithms to choose from, it was necessary to choose the best algorithm for this specific dataset. Ten different training algorithms were tested; the Levenberg-Marquardt (LM) algorithm was chosen since it gave the lowest MSE and the highest R-squared. The final results showed that the supervised ANN can predict lost circulation with an overall R-squared of 0.925 for induced-fracture formations. This estimation will help drilling personnel prepare remedies before entering the loss zone and adjust the key drilling parameters to avoid, or at least mitigate, losses as a proactive approach. This ANN can be used globally for any induced-fracture formation suffering from lost circulation to estimate mud losses.
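The topology described (one hidden layer, ten neurons, a 60/20/20 split, MSE as the metric) can be sketched as follows. This is an illustration only: it trains on synthetic stand-in data with plain gradient descent, not the paper's Levenberg-Marquardt algorithm or field dataset.

```python
import numpy as np

# Sketch of the described topology: one hidden layer with ten neurons,
# data split 60/20/20, MSE as the quality metric. Synthetic stand-in data;
# plain gradient descent replaces the paper's Levenberg-Marquardt training.
rng = np.random.default_rng(0)

X = rng.normal(size=(500, 4))                      # stand-in drilling parameters
y = (X @ np.array([1.5, -2.0, 0.7, 0.3])).reshape(-1, 1)
y += 0.1 * rng.normal(size=y.shape)                # stand-in mud losses

n = len(X)
idx = rng.permutation(n)                           # 60/20/20 split
tr = idx[:int(0.6 * n)]
va = idx[int(0.6 * n):int(0.8 * n)]                # held out for model selection
te = idx[int(0.8 * n):]

W1 = rng.normal(scale=0.5, size=(4, 10)); b1 = np.zeros(10)   # hidden layer
W2 = rng.normal(scale=0.5, size=(10, 1)); b2 = np.zeros(1)    # output layer

def forward(Xb):
    h = np.tanh(Xb @ W1 + b1)
    return h, h @ W2 + b2

mse_before = float(np.mean((forward(X[te])[1] - y[te]) ** 2))

for _ in range(2000):                              # full-batch gradient descent
    h, pred = forward(X[tr])
    err = pred - y[tr]
    dh = (err @ W2.T) * (1 - h**2)                 # backprop through tanh
    W2 -= 0.01 * (h.T @ err) / len(tr); b2 -= 0.01 * err.mean(0)
    W1 -= 0.01 * (X[tr].T @ dh) / len(tr); b1 -= 0.01 * dh.mean(0)

mse_test = float(np.mean((forward(X[te])[1] - y[te]) ** 2))
r_squared = 1.0 - mse_test / float(np.var(y[te]))
```

The quality metrics reported in the abstract (MSE and R-squared) fall out of the last two lines; the validation split would be used to compare candidate architectures and training algorithms before scoring on the test set.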
As the demand for energy increases, the drilling process is becoming more challenging. Thus, more advanced tools such as ANNs are required to better tackle these problems. The ANN built in this paper can be adapted into commercial software that predicts lost circulation for any induced-fracture formation globally.
Saluja, Vikas (Oil & Natural Gas Corporation LTD.) | Singh, Uday (Oil & Natural Gas Corporation LTD.) | Ghosh, Aninda (Oil & Natural Gas Corporation LTD.) | Prakash, Puja (Oil & Natural Gas Corporation LTD.) | Kumar, Ravendra (Oil & Natural Gas Corporation LTD.) | Verma, Rajeev (Oil & Natural Gas Corporation LTD.)
The case study demonstrated here is an innovative workflow for fault delineation on a 3D seismic volume in the B-173A Field of the Heera Panna Bassein (HPB) Sector, Western Offshore Basin, India. B-173A is located 50 km west of Mumbai at an average water depth of about 50 m. The field was discovered in 1992 and put on production in August 1998. The field has two hydrocarbon-bearing zones: the gas-bearing Mukta Formation (Lower Oligocene carbonates) and the oil-bearing Bassein Formation (Middle to Upper Eocene carbonates).
The present study extends a workflow for Advanced Seismic Interpretation using Spectral Decomposition and RGB Blending for fault delineation: iso-frequency volumes are extracted from relative acoustic impedance data instead of the seismic data itself.
The workflow, aimed at effective fault delineation, consists of spectral decomposition of relative acoustic impedance data and RGB blending of discontinuity attributes from different iso-frequency volumes.
It is observed that the RGB blend volume of discontinuity attributes provides more convincing results for fault delineation than traditional discontinuity attributes alone.
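The blending step can be sketched as follows: band-limit a section into three iso-frequency versions, compute a discontinuity attribute on each, and map the three to the R, G, and B channels. The synthetic layered section, 4 ms sampling, band edges, and gradient-based discontinuity are all illustrative assumptions, not the paper's parameters:

```python
import numpy as np

# Illustrative stand-in: horizontally layered "impedance" section with a
# vertical fault (right half shifted down 8 samples).
nt, nx, dt = 256, 128, 0.004                       # 4 ms sampling (assumed)
t = np.arange(nt) * dt
trace = np.sin(2 * np.pi * 12 * t) + 0.5 * np.sin(2 * np.pi * 35 * t)
section = np.tile(trace[:, None], (1, nx))
section[:, 64:] = np.roll(section[:, 64:], 8, axis=0)   # the "fault"

def iso_frequency(data, f_lo, f_hi):
    """Band-pass along the time axis via FFT masking; return band amplitude."""
    spec = np.fft.rfft(data, axis=0)
    freqs = np.fft.rfftfreq(data.shape[0], d=dt)
    mask = ((freqs >= f_lo) & (freqs < f_hi))[:, None]
    return np.abs(np.fft.irfft(spec * mask, n=data.shape[0], axis=0))

def discontinuity(attr):
    """Simple lateral-gradient 'discontinuity' attribute, scaled to [0, 1]."""
    g = np.pad(np.abs(np.diff(attr, axis=1)), ((0, 0), (0, 1)))
    return g / (g.max() + 1e-12)

bands = [(5, 15), (15, 30), (30, 60)]              # Hz; illustrative low/mid/high
rgb = np.stack([discontinuity(iso_frequency(section, lo, hi))
                for lo, hi in bands], axis=-1)     # (nt, nx, 3) RGB blend
```

In this toy section the fault column lights up in the blend while laterally continuous columns stay dark, which is the visual effect the workflow exploits.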
In recent years, the oil and gas industry has gained greater operational efficiencies and productivity by deploying advanced technologies, such as smart sensors, data analytics, artificial intelligence and machine learning — all linked via Internet of Things connectivity. This transformation is profound, but just starting. Leading offshore E&P operators envision using such applications to help drive their production costs to as low as $7 per barrel or less. A large North Sea operator among them successfully deployed a low-manned platform in the Ivar Aasen field in December 2016, operating it via redundant control rooms — one on the platform, the other onshore 1,000 kilometers away in Trondheim, Norway. In January 2019, the offshore control room operators handed over the platform's control to the onshore operators, and it is now managed exclusively from the onshore one. One particular application — remote condition monitoring of equipment — supports a proactive, more predictive condition-based maintenance program, which is helping to ensure equipment availability, maximize utilization, and find ways to improve performance. This paper will explain the use case in greater detail, including insights into how artificial intelligence and machine learning are incorporated into this operational model. Also described will be the application of a closed-loop lifecycle platform management model, using the concepts of digital twins from pre-FEED and FEED phases through construction, commissioning, and an expected lifecycle spanning 20 years of operations. It is derived from an update to a paper presented at the 2018 SPE Offshore Technology Conference (OTC) that introduced the use case in its 2017-18 operating model, but that was before the debut of the platform's exclusive monitoring of its operations by its onshore control room.
Computer vision (CV) techniques were applied to X-ray computed tomographic (CT) images of reservoir cores to evaluate their potential for rapidly identifying fractures and other lithologic/geologic characteristics. The analyses used feature-labeled CT cross-sectional images, each composed of submillimeter voxels whose CT numbers are sensitive to density and atomic number, as inputs to a Fully Convolutional Neural Network (FCN) for semantic segmentation of reservoir core features. In an FCN, an image is interrogated using a series of sliding windows at various scales to create weighted filters that reduce error between classes in training images. These networks of filter layers were used to assign class probabilities, which were upscaled back to the original image dimensions, resulting in a probabilistic class assignment for each pixel. FCN model accuracy, defined by the ability to replicate manually assigned labels in the raw (unannotated) training image stack, was at least 80% and generally improved with the size of the training set. Once the labels were assigned, the underlying feature frequency, orientation, and size were measured in 3D volume reconstructions using algorithms modified from standard image-analysis software. This method allows users to endow a classification model with subject-matter knowledge for further, autonomous label prediction. Thus, while initial image annotation was labor intensive, subsequent images were rapidly classified once the model was built. The abundance, orientation, and size of fractures in the classified labels were then calculated to characterize the spatial information of these features. FCN combined with fracture labeling improved knowledge capture and automation of fracture identification. Models trained on high-quality 3D datasets can greatly reduce the time needed to describe subsequent core.
The method demonstrated is not limited to fractures; models for other lithologic/geologic features could be trained using the same approach, which may yield additional efficiencies.
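The per-pixel probabilistic assignment at the heart of the segmentation step can be sketched as below: convolutional class-score maps pass through a softmax to give class probabilities, then an argmax label map. The tiny hand-set filters and two classes ("matrix" vs "fracture") are assumptions for illustration; a real FCN learns many filter layers from annotated training data.

```python
import numpy as np

# Toy CT slice: a synthetic vertical "fracture" of bright pixels.
image = np.zeros((32, 32))
image[:, 15:17] = 1.0

def conv2d_same(img, kernel):
    """Naive 3x3 'same' convolution (zero-padded); enough for the sketch."""
    padded = np.pad(img, 1)
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + 3, j:j + 3] * kernel)
    return out

# Hand-set "learned" filters (assumed for illustration): one responds to
# dark background, the other to vertical bright features.
k_matrix = np.full((3, 3), -1 / 9.0)
k_fracture = np.array([[-1, 2, -1]] * 3) / 3.0

scores = np.stack([conv2d_same(image, k_matrix),
                   conv2d_same(image, k_fracture)], axis=-1)

# Per-pixel softmax -> class probabilities, then an argmax label map,
# mirroring the probabilistic class assignment described in the abstract.
e = np.exp(scores - scores.max(axis=-1, keepdims=True))
probs = e / e.sum(axis=-1, keepdims=True)
labels = probs.argmax(axis=-1)     # 0 = matrix, 1 = fracture
```

The resulting label map isolates the fracture columns; downstream analysis (abundance, orientation, size) operates on such maps stacked into 3D volumes.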
Al-Hameedi, Abo Taleb T. (Missouri University of Science and Technology) | Alkinani, Husam H. (Missouri University of Science and Technology) | Dunn-Norman, Shari (Missouri University of Science and Technology) | Amer, Ahmed S. (Newpark Technology Center/ Newpark Drilling Fluids)
Equivalent circulation density (ECD) management is a key factor in the success of drilling operations, especially when dealing with narrow mud-weight windows. Poor management of ECD can result in unsafe and/or inefficient drilling as well as increased drilling cost due to the associated nonproductive time (NPT). Different parameters can affect the ECD directly or indirectly, including, but not limited to, wellbore geometry, cuttings, hole-cleaning efficiency, flow rate, and the rheological properties of the drilling fluid. However, the magnitude of the effect of each parameter is not well understood. In this paper, a comprehensive statistical analysis using the correlation coefficient was conducted on real field data to investigate the effect of three controllable factors - solid contents (SC), yield point (Yp), and plastic viscosity (PV) - on ECD.
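The screening described can be sketched with Pearson correlation coefficients. The synthetic data below, and the assumed dependence of ECD mostly on solids content, merely stand in for the field records:

```python
import numpy as np

# Hedged sketch of correlation-coefficient screening of ECD drivers.
# Synthetic data; ranges and the SC-dominant dependence are assumptions.
rng = np.random.default_rng(3)
n = 500

pv = rng.uniform(10, 35, n)            # plastic viscosity, cP
yp = rng.uniform(5, 30, n)             # yield point, lb/100ft2
sc = rng.uniform(2, 12, n)             # solid contents, %

# construct a synthetic ECD that depends most strongly on solids content
ecd = 9.0 + 0.08 * sc + 0.02 * yp + 0.005 * pv + rng.normal(0, 0.05, n)

correlations = {
    name: float(np.corrcoef(x, ecd)[0, 1])
    for name, x in [("SC", sc), ("Yp", yp), ("PV", pv)]
}
```

Ranking the coefficients gives the relative strength of each controllable factor, which is the kind of magnitude comparison the study performs on real field data.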
Vertical interference tests (VIT) are used to determine the hydraulic connectivity between formation sand intervals. This paper showcases an innovative workflow that uses petrophysical log attributes to characterize a heterogeneous reservoir sand by means of ANN (Artificial Neural Network) and SMLP (Stratigraphic Modified Lorenz Plot) based rock-typing techniques, as well as image-based advanced sand-layer computation techniques.
A vertical interference test is performed either with a wireline formation-testing tool with multiple flow probes deployed in a vertical sequence at desired depth points on the borehole wall, or with a drill stem test configuration. Based on the test design, flow rates are changed using downhole pumps, which induces pressure transients in the formation. The measured pressure response is then compared with a numerical model to derive reservoir parameters such as vertical permeability and hydraulic connectivity. The conventional way of generating the model is to consider a section of reservoir sand as homogeneous, which generally leads to overestimation or underestimation of vertical permeabilities. The technique proposed in this paper utilizes advanced logs such as image logs, magnetic resonance logs, water saturation, and other advanced lithology logs to honor heterogeneity in the reservoir model through ANN/SMLP-based rock-typing techniques. These rock types help build a multilayer formation model for the VIT modeling and regression approach. The vertical interference test model is then used to determine the vertical permeability values for each individual rock type. The paper demonstrates the workflow of using the rock-type-based layered formation model in vertical interference test modeling for a channel sand scenario.
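The over/underestimation mentioned above follows from series flow: vertical flow across layers averages harmonically, so a thin tight baffle that a homogeneous model smears out actually dominates the effective vertical permeability. A sketch with illustrative layer values, not the paper's data:

```python
# Hedged sketch: effective vertical permeability of a layered model vs a
# "homogeneous" thickness-weighted average. Layer values are illustrative.
layers = [  # (thickness m, kv mD) per rock type from ANN/SMLP rock typing
    (3.0, 500.0),
    (0.5, 1.0),    # thin low-permeability baffle identified by rock typing
    (4.0, 800.0),
]

total_h = sum(h for h, _ in layers)

# series (vertical) flow: thickness-weighted harmonic mean
kv_harmonic = total_h / sum(h / k for h, k in layers)

# what a homogeneous-section view would suggest: arithmetic average
kv_arithmetic = sum(h * k for h, k in layers) / total_h
```

Here the harmonic mean is roughly 15 mD against an arithmetic average of roughly 630 mD, illustrating why a homogeneous assumption badly misestimates vertical permeability when baffles are present.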
This paper presents a data-driven approach to answer the question of whether premium, high-strength white sand proppant, while more expensive than regional (brown) sand, is justified by its alleged ability to make better producing wells. For this study, 739 horizontal wells with production and stimulation data were used in a robust statistical approach to conclude that, for the most common set of well characteristics, white sand will produce a superior NPV-weighted economic outcome compared with lower-cost regional (brown) sand alternatives. While there are wells in this analysis that did not support this robust conclusion of "white sand is better", none of them produced an outcome that "brown sand was better"; rather, several of the wells simply had results that were statistically inconclusive. This paper serves as a good example of what data are needed to perform such an analysis and of the challenges of normalizing 'first-order effects' that dominate well productivity (TVD, lateral length, and proppant intensity) while attempting to ascertain the influence of 'second-order' factors such as sand type. Becoming familiar and adept with these analysis methods should facilitate the statistical verification of other second-order effects toward finding the optimal stimulation treatment.
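The three-way call described ("white sand better", "brown sand better", "inconclusive") can be sketched as a two-sample comparison. Welch's t-statistic on synthetic NPV-like samples is one such test; the sample sizes, dollar figures, and the ~1.98 critical value are illustrative assumptions, not the paper's exact method:

```python
import numpy as np

# Hedged sketch of a two-sample comparison of NPV outcomes by sand type.
# Samples are synthetic; the paper's 739-well dataset is not public here.
rng = np.random.default_rng(4)

npv_white = rng.normal(10.0, 2.0, 60)   # $MM, hypothetical white-sand wells
npv_brown = rng.normal(8.0, 2.0, 60)    # $MM, hypothetical brown-sand wells

def welch_t(a, b):
    """Welch's t-statistic (unequal variances)."""
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

t_stat = welch_t(npv_white, npv_brown)

# ~1.98 approximates the two-sided 5% critical value for these sample sizes;
# |t| below it is treated as statistically inconclusive.
verdict = ("white sand better" if t_stat > 1.98
           else "inconclusive" if t_stat > -1.98
           else "brown sand better")
```

Note that this only captures the final comparison; the normalization of first-order effects (TVD, lateral length, proppant intensity) the paper emphasizes would have to happen before any such test is meaningful.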
This paper discusses the various approaches that have been taken to reduce downtime in coal bed methane (CBM) wells by predicting string-integrity failures in advance. Approaches such as principal component analysis (PCA) based T-statistics and the bag-of-features approach have been applied to find a solution. These approaches fall under the classical supervised-learning classification paradigm. Their power and usefulness are fuelled, and limited, by the number and richness of annotations and sensors at the well site. Applied to the CBM wells, these approaches gave very encouraging results, demonstrating their usefulness in enhancing well efficiency and decreasing well downtime by planning around the model's failure predictions.
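A PCA-based T-statistic monitor of the kind mentioned can be sketched as: fit PCA on healthy-period sensor data, score new samples with Hotelling's T-squared in the retained-component subspace, and flag large values as anomalies that may precede failure. The sensor count, component count, and the synthetic fault shift are illustrative assumptions:

```python
import numpy as np

# Synthetic stand-in for healthy-period well-site sensor records.
rng = np.random.default_rng(5)
healthy = rng.normal(size=(500, 6))
mean = healthy.mean(axis=0)

# PCA via SVD on centered data; retain k components.
_, S, Vt = np.linalg.svd(healthy - mean, full_matrices=False)
k = 3
P = Vt[:k].T                                   # loadings (6 x k)
lam = (S[:k] ** 2) / (len(healthy) - 1)        # retained component variances

def t_squared(x):
    """Hotelling's T^2 of one sample in the retained PCA subspace."""
    scores = (x - mean) @ P
    return float(np.sum(scores**2 / lam))

# Healthy-like samples vs samples with a fault-like shift on one sensor.
normal_t2 = [t_squared(s) for s in rng.normal(size=(200, 6))]
shifted = rng.normal(size=(200, 6)) + np.array([8, 0, 0, 0, 0, 0])
fault_t2 = [t_squared(s) for s in shifted]
```

In practice a control limit would be set from the healthy-period T-squared distribution, and sustained excursions above it would trigger the maintenance planning the abstract describes.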