Session 1: What Does Your Work Look Like in 2030? There is little doubt that digital transformation is going to change the way we work in 2030. Transformation is expected to add billions of dollars in value, from data to modeling to contracting to health, safety, and environment (HSE). Will we finally have data that is perfect and available at our fingertips? Will machines build the reservoir model, leaving petrotechnical experts to choose and quality-control the best models?
SPE, through its Energy4me programme, will present a free one-day energy education workshop for science teachers (grades 8–12). A variety of free instructional materials will be available to take back to the classroom. Educators will receive comprehensive, objective information about the scientific concepts of energy and its importance while discovering the world of oil and natural gas exploration and production. Energy4me is an energy education public outreach programme that highlights how energy works in our everyday lives and promotes information about career opportunities in petroleum engineering and the upstream professions. SPE's Energy4me programme values the role teachers and energy professionals play in educating young people about the importance of energy.
E&P operators are aggressively looking to increase production through operational efficiency gains. Proactive well and field production management requires digital enablement of operations, with no data silos and data flowing seamlessly from the subsurface to the hands of the operator. With huge amounts of data being collected, it is imperative to apply data-driven techniques to gain insights that can be used to better manage production. A data-driven approach can provide major benefits for organisations holding vast amounts of reservoir, production, and facilities data. It can provide insights into non-linear, multidimensional relationships between parameters so that field development is better understood and optimized, and it can allow companies to take a proactive approach to field operations and equipment maintenance, resulting in additional cost savings.
This paper presents case studies in which operators improved operational efficiency and optimized production using edge-driven Industrial Internet of Things (IIoT) solutions. These edge IIoT solutions enable fast-loop control through a combination of physics- and data-driven workflows, which empowers operators to proactively manage their assets and focus attention on potentially problematic wells. The solution's architecture and its ability to deliver fast-loop control workflows at the edge enable operators to detect and manage potential issues with equipment such as artificial lift pumps, ultimately improving performance. Additionally, this approach reduces the dependency on domain experts to frequently analyze data. High-frequency data capture enabled confident prediction of equipment performance and allowed remote well management, reducing health, safety, and environment (HSE) risks while decreasing logistics and maintenance costs.
New generations learn in different ways, but managerial styles have prevailed unchanged for decades, creating a gap that greatly reduces the effectiveness of talent development. More than a gap, this creates a risk, especially for ensuring the training outcomes of Generation Z individuals in the oil and gas industry, where safety in the face of hazards must prevail and where production optimization is a leading goal for the very sustainability of the industry.
This paper focuses on establishing the main differences in the learning style of the so-called Generation Z.
The urgency of implementing step-changes in the approach that Human Resources and Training departments take to developing Gen-Z is clear, most particularly in the mentoring, recognition, and supervisory styles that their leaders will have to adopt.
Esteves Aranha, Pedro (Petrobras) | Abensur Gandelman, Roni (Petrobras) | Henrique Martins dos Santos, Pedro (Petrobras) | Borella Hougaz, Augusto (Intelie) | Gomes Clemente, Ricardo (Intelie) | Dumlao, Vincent (Intelie) | Henrique Santos Teixeira, Pedro (Intelie) | Mazzi, Vitor (Intelie)
The act of planning a well is a collaborative effort, taking disparate data from various groups and synthesizing it into one overarching program. Each operator follows its own well design process, but common to all operators is the fact that wells are not designed wholly in series; rather, designs have many parallel sections, with a set of decision gates and countless data interdependencies. The goal of this paper is to detail the successful development and implementation of an integrated planning platform within a National Oil Company.
The individual components of the well design process are not performed in a vacuum. The inputs/outputs of one application affect the inputs/outputs of applications both upstream and downstream of it along the design process. This is addressed by facilitating the integration and analysis of the data input/output of each component of the planning process, synthesizing it, and performing automated system-integrity and overall conformity checks between the interdependent components. Any change to an individual design component that affects other areas must be flagged, notifying the appropriate parties. Defining and automating workflows, mapping data interdependencies within the workflows, and creating a system of data governance are also keys to building an efficient planning platform. Similar to a navigation program that determines an optimal driving route by considering various dynamic and static data points, the implemented platform uses a data-adaptive approach to well planning. It takes unstructured data processes, decision trees, and data integration protocols and automates them, while also permitting users to collaborate on well planning and design, allowing the standardization of processes. Operators are able to embed their particular methods of well design into the software platform and thus ensure that the entire company meets the specified requirements. Key to the effectiveness and longevity of this type of platform is a neutral data repository. Keeping data free of any singular data protocol ensures that although individual applications may be revised, replaced, or put into competition with one another, the format of the data produced and ingested by these applications stays constant. Data remains mapped, both in terms of overall process workflow and interdependencies.
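The change-flagging behavior described above, where every downstream design component is notified when an upstream input changes, can be sketched as a small dependency graph. This is an illustrative sketch only; the component names and the class API are hypothetical, not the platform's actual interface.

```python
from collections import defaultdict, deque

class PlanningGraph:
    """Hypothetical sketch: map data interdependencies between well-design
    components and flag everything downstream of a changed input."""

    def __init__(self):
        # upstream component -> set of components that consume its output
        self.downstream = defaultdict(set)

    def depends_on(self, component, upstream):
        """Declare that `component` consumes the output of `upstream`."""
        self.downstream[upstream].add(component)

    def affected_by(self, changed):
        """Return all components that must be re-checked after `changed`
        changes, following the dependency chain breadth-first."""
        seen, queue = set(), deque([changed])
        while queue:
            for nxt in self.downstream[queue.popleft()]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen
```

For example, flagging a trajectory revision would surface both the casing design that consumes it and any cementing plan built on that casing design, mirroring the automated conformity checks the paper describes.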
This paper will benefit any operator wishing to gain insight into developing a forward-thinking digital strategy for well planning and design. Through the use of these methods and the development of a similarly integrated platform, an operator can standardize processes, enforce and ensure data governance, and gain efficiency in overall planning time.
Satti, Rajani (Baker Hughes) | Bale, Derek (Baker Hughes) | Patel, Amar (Baker Hughes) | Nazarenko, Pavel (Baker Hughes) | Avella, Oscar (Baker Hughes) | Solorzano, Pedro (Ecopetrol) | Sanchez, Walter (Ecopetrol) | Giosa, Carlos (Ecopetrol) | Satizabal, Monica (Ecopetrol) | Vega, Sandra (Ecopetrol) | Hernandez, Nini (Ecopetrol) | Coronel, Ivan (Ecopetrol)
Injector well completions are typically carried out using two methods: simple and selective. The performance of injector wells has traditionally been evaluated using spinner-based injection logging tools (ILTs) or on-demand fiber-optic distributed temperature or acoustic (DTS or DAS) sensing, with well performance logged every three months or once a year depending on the application. However, because such methods take a snapshot approach, critical well performance information or possible anomalies are often missed, since measurements are taken only during certain periods. Therefore, operators have explored advanced fiber-optic methods such as permanent distributed temperature sensing (DTS) and permanent distributed acoustic sensing (DAS), which provide continuous, real-time measurements that enable understanding of dynamic well behavior at all times and help mitigate deferred-production or behavioral problems.
Of relevance to this work is a vertical injector well in a competent sandstone formation of the heavy-oil Chichimene field in the Llanos Basin, Colombia. As a first step of the fiber-optic monitoring strategy, a careful evaluation of DTS- and DAS-based fiber-optic methods was conducted. Based on the data analysis and operational history, DAS-based fiber-optic monitoring was chosen as the most effective monitoring solution for this well. Subsequently, a proprietary DAS algorithm was developed to analyze the data and estimate the flow allocation for all four zones. The results include waterfall acoustic energy maps, temporal flow allocation profiles and, most importantly, the zonal flow allocation values.
Predicted (DAS) zonal flow allocation data was compared with traditional injection logs (ILT) under different operational conditions (varied injection flow rates and valve choke settings). Based on the close agreement between DAS and ILT data, the operator decided to replace ILT runs with DAS-based fiber-optic monitoring, resulting in lower operational costs while enabling near real-time monitoring and providing the continuous distributed data essential for dynamic monitoring of the well. The successful application of fiber-optic monitoring to provide an injection profile in conjunction with a surface-controlled electric valve system demonstrates significant potential to optimize the injection process in complex injector wells. Further, remotely controlling, monitoring, and optimizing injection rates into the multi-segmented zones extends the service life of the injection operations, eliminates future intervention costs, and increases ultimate recovery.
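The paper's DAS algorithm is proprietary, but the core idea of a zonal flow allocation derived from a waterfall acoustic energy map can be sketched in a few lines, under the simplifying assumption that time-averaged acoustic energy in each zone scales with the flow it receives. The function name and interface below are illustrative, not the authors' implementation.

```python
import numpy as np

def zonal_flow_allocation(energy, depths, zones, total_rate):
    """Split a total injection rate across zones in proportion to
    time-averaged DAS acoustic energy (a simplifying assumption).

    energy:     (n_times, n_depths) waterfall acoustic energy map
    depths:     (n_depths,) depth axis of the fiber measurements
    zones:      list of (top, bottom) depth intervals, one per zone
    total_rate: surface injection rate to be allocated
    """
    mean_energy = energy.mean(axis=0)          # average over time
    zone_energy = np.array([
        mean_energy[(depths >= top) & (depths < bottom)].sum()
        for top, bottom in zones
    ])
    fractions = zone_energy / zone_energy.sum()  # energy-weighted split
    return fractions * total_rate
```

In practice the acoustic-to-flow relationship is more involved than a linear proportion, which is why a calibrated, proprietary algorithm and ILT comparison runs were needed; this sketch only conveys the allocation step.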
This workflow evaluates reservoir souring risk for a deepwater development, taking into consideration dynamic reservoir characteristic effects over time and incorporating detailed flow modelling of the system to assess H2S partitioning effects and determine their design and operational impact. The objectives are to showcase an integrated multi-domain approach to souring modeling from reservoir to facilities, provide a reservoir souring modeling workflow applicable to new and existing offshore waterflooding projects, and present a case study in which this workflow was successfully implemented at two distinct points (pre-first oil and after eight years of production) of an existing development.
This unique workflow takes a holistic approach to reservoir souring prediction. This process may be implemented either pre-first oil or during field production in order to optimize souring mitigation strategies and operations, such as sulfate removal and nitrate injection. If the evaluation is to be undertaken before production operations, data will be more limited, however, this methodology has proved to still be effective in determining the risk associated with souring for that particular reservoir. All aspects of the system are considered: reservoir fluid characteristics, reservoir static and dynamic models, production system thermo-hydraulic and produced fluids thermodynamic performance, and the critical subsea and topsides design considerations.
The workflow has been proven to simulate the souring process (the growth and respiration of SRB leading to H2S formation in the reservoir), as well as the biological reactions related to the injection of nitrate to inhibit these phenomena, for a range of conditions using the latest modeling tools available.
The approach helped in understanding the complex set of bio-chemical reactions involved in the production of H2S and its potential inhibition to avoid reservoir souring in a particular field. A sensitivity analysis using the integrated special reservoir simulation model was able to predict the effectiveness of nitrate injection and has also been used to guide the design phase, while supporting the subsequent optimization of the mitigation strategy based on the evaluation of actual production data.
The comprehensive modeling approach accounts for variation in biological system characteristics, reservoir conditions, and hydrodynamics of the production system that affect the production of hydrogen sulfide. Elevated levels of H2S can cause material integrity issues, including corrosion, as well as health and safety issues and loss of hydrocarbon value, since high H2S contents reduce the monetary value of produced oil or gas.
The field case study presents souring models calibrated and validated with actual data collected from the production streams over eight years of history, and a set of sensitivity scenarios was analyzed to account for uncertainties in various simulation parameters. The results show that the dosage of calcium nitrate injected in this instance successfully controlled SRB growth and that no reservoir souring was detected. However, the application of the developed workflow allowed the operator to optimize the calcium nitrate dosage, enabling a considerable reduction in the injection rate and subsequent operating expenditure reductions.
The value added by this workflow during waterflood management using seawater injection is that it allows operators to determine reservoir souring behavior clearly and in a timely manner and to anticipate the potential risks associated with it. The findings are also useful for supporting major design and operational decisions, including the development and optimization of souring management and mitigation techniques.
Tokpanov, Yury (Caltech) | Smith, Johanna (Quantum Reservoir Impact LLC) | Ma, Zheren (Quantum Reservoir Impact LLC) | Deng, Lichi (Quantum Reservoir Impact LLC) | Benhallam, Wassim (Quantum Reservoir Impact LLC) | Salehi, Amir (Quantum Reservoir Impact LLC) | Zhai, Xiang (Quantum Reservoir Impact LLC) | Darabi, Hamed (Quantum Reservoir Impact LLC) | Castineira, David (Quantum Reservoir Impact LLC)
Stratigraphic correlation is essential in field evaluation, as it provides the tops necessary to compartmentalize the reservoir. It further contributes to other parts of the field development planning cycle, such as reservoir modeling, volumetric assessment, and production allocation. The traditional approach of manual pairwise correlation is labor-intensive and time-consuming. This research presents a novel automated stratigraphic correlator that creates well top and zonation interpretations using supervised machine learning algorithms, namely Convolutional and Recurrent Neural Networks (CNNs and RNNs).
An automated stratigraphic correlator is created that learns stratigraphic well top and zonation interpretations from the well logs of a subset of wells whose zonation information was manually provided by human experts. The method can efficiently learn the patterns and hidden information in the well logs' sequential data, implicitly capture the domain expertise, and streamline and automate the traditionally manual, repetitive work. Our method supersedes existing approaches such as Multiple Sequence Alignment (MSA) by incorporating domain expertise through tops/zones picked by geologists. A Bidirectional Long Short-Term Memory (BiLSTM) network is used to interpret the log data, since deposition is by nature a sequential process and RNNs can intrinsically capture such series. An Inception autoencoder CNN is also applied in this workflow for stratigraphic interpretation. Reliable post-processing is included as well: the predicted zone probability logs are used to quantify the overall confidence score of well zonation and to correct misinterpretations when necessary, using transition frequencies in log data through a linear-chain probabilistic graphical model. The methodology is tested on one of the major Middle East oilfields, with around 1,500 wells, to prove its efficiency and capability.
The overall methodology involves data pre-processing, deep learning model training and prediction, and post-processing of the model-predicted results. In this specific workflow, the machine learning targets include both the prediction of zones (a multi-class classification/segmentation problem) and the prediction of well tops (an edge-detection problem). Thus, supervised multi-task learning on a single field using CNNs and RNNs is implemented so that the same model can perform different tasks. The inputs to the training module include trajectory logs and other measured logs such as gamma-ray, resistivity, and neutron density. All inputs are normalized to zero mean and unit standard deviation. For wells with missing log values, the approach can either discard them or perform data imputation to reconstruct the data using different automated algorithms. The machine learning engine uses two different algorithms (BiLSTM and Inception autoencoder CNN), with many other deep learning models also tested. The training loss function includes a zone categorical cross-entropy loss, a tops edge-detection binary cross-entropy loss, and an L2-norm regularization term. The learning rate is dynamically adjusted during training: it is reduced when the loss stalls. The post-processing uses the predicted zone probability logs to select the zoning sequence that maximizes overall zonation probability and treats that probability as the confidence score of the well zonation. This dramatically helps in constraining the resulting stratigraphic interpretation by geological succession and minimizing the correlation error. The entire workflow has been applied to a major Middle East oilfield with a large number of pre-interpreted well logs, with 60% of the wells used to train the deep learning models, 20% used for validation, and the rest reserved for blind testing. Both the BiLSTM and the Inception autoencoder CNN show close-to-human-level performance on the blind test dataset.
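The post-processing step described above, selecting the zoning sequence that maximizes overall zonation probability while honoring geological succession through zone transition frequencies, can be sketched as a Viterbi-style decode over the predicted zone probability logs. This is an illustrative sketch under that interpretation, not the authors' implementation; the function name and the form of the transition matrix are assumptions.

```python
import numpy as np

def decode_zonation(zone_probs, transitions):
    """Viterbi-style decode: pick the depth-wise zone sequence that
    maximizes overall probability subject to allowed zone transitions.

    zone_probs:  (n_depths, n_zones) model-predicted zone probabilities
    transitions: (n_zones, n_zones) transition weights between consecutive
                 samples; zeros forbid geologically impossible jumps
    """
    n, k = zone_probs.shape
    logp = np.log(zone_probs + 1e-12)
    logt = np.log(transitions + 1e-12)
    score = np.zeros((n, k))
    back = np.zeros((n, k), dtype=int)
    score[0] = logp[0]
    for i in range(1, n):
        cand = score[i - 1][:, None] + logt      # rows: previous zone
        back[i] = cand.argmax(axis=0)            # best predecessor per zone
        score[i] = cand.max(axis=0) + logp[i]
    path = np.zeros(n, dtype=int)
    path[-1] = score[-1].argmax()
    for i in range(n - 2, -1, -1):               # backtrace the best path
        path[i] = back[i + 1, path[i + 1]]
    # per-sample geometric-mean probability as a crude confidence score
    confidence = float(np.exp(score[-1].max() / n))
    return path, confidence
```

With a transition matrix that only allows staying in a zone or advancing to the next one, a noisy sample whose raw argmax would jump back up-section gets corrected to a geologically consistent sequence, which is the constraining effect the workflow relies on.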
The mean absolute error of well top interpretation after post-processing is around 3 m across all analyzed wells, corresponding to an accuracy of nearly 90% on the blind test dataset. The classification precision and accuracy also demonstrate close-to-human-level performance in the major zones with sufficient data. It has been observed that when no data are missing, the Inception autoencoder CNN achieves the best performance, while the BiLSTM benefits substantially from imputation when data are missing.
The methodology automates and streamlines the originally time-consuming stratigraphic correlation process. It performs better than existing approaches through a well-developed machine learning framework with comprehensive data pre- and post-processing. The resulting stratigraphic correlation proves to be extremely reliable even with a small number of seed wells, and it requires minimal user intervention during the process. Through deep learning techniques such as transfer learning, the proposed methodology can be readily applied to other fields even with limited training data.
Goh, Kim Fah Gordon (Schlumberger Technology Corporation, Houston US) | Biniwale, Shripad (Schlumberger Abingdon Technology Center, UK) | Musayev, Rashid (Schlumberger Technology Corporation, Houston US) | Ahmed Elfeel, Mohamed (Schlumberger Abingdon Technology Center, UK)
Intelligent Completions (IC) are deployed with high hopes of frequent data utilization and zonal selectivity maneuvers to optimize production continuously. The permanent downhole presence of measurements such as pressure, temperature, rate, water cut, and gas breakthrough provides downhole indicators and trending analysis of production performance and injection conformance. These are utilized not only to maximize hydrocarbon production but also to reduce surface handling of water and/or gas, improve injection efficiency, and reduce the carbon and environmental footprint. However, the reality can differ between the evaluation stage and the application stage. Asset production engineers and reservoir engineers face real challenges when it comes to design, downhole installation, data transmission, real-time analysis, and optimization to deliver the real value of the initial investment. These suboptimal application factors, multiplied by the complexity of IC deployment and execution with existing hardware constraints, have limited the progression towards digital well technology. By analyzing such trends, a new advanced completion optimization methodology has been devised, leveraging the latest technology and innovation, IC deployment simplification, and electrification efforts in the industry.
This paper analyzes the reasons for the underutilization of digital well technology, such as design and implementation capability, downhole data measurement, the complexity of modeling and optimization, and the bottlenecks in applying the learnings from Intelligent Completion data to optimize production. This is then contrasted with the easing transition to future digital wells and the advanced modeling capabilities that are driving oilfield digitalization through next-generation Intelligent Completions. This digital transition ranges from ease of deployment to ease of optimization and eventually towards cloud-enabled decision making. The new era of IC electrification and digital solutions is converging to provide an integrated platform that maximizes value and strengthens the justification for more digital wells in the future.
A fully digital system to control the reservoir and optimize production is becoming a reality with the transformation of modeling capability, enabled by the simplification of IC deployment; this is the digital future of IC optimization. This digital solution continuously feeds the asset subsurface, modeling, and optimization teams with productivity or injectivity indices and other inputs required for steady-state and transient reservoir evaluation. The IC industry continues to integrate the new solution frontiers of logging-while-producing and testing-while-producing capabilities into an eventual optimizing- and modeling-while-producing future, leading towards a true digital oilfield.
Offset well analysis is the process of investigating and integrating historical drilling performance from neighboring wells into prospect well design. Traditional offset analysis is a time- and resource-intensive process that requires a great deal of manual input and analysis to make various design decisions. Our proposed workflow automates much of this analysis while allowing the user to customize designs based on operating metrics, resulting in a quicker and more comprehensive offset well analysis that provides the user with intelligent offset recommendations and a base design for the prospect well. Historical data is queried from a structured engineering and operations database comprising data ranging from subsurface geology and drilling to production and end-of-well reports. We gather this historical data from nearby wells and learn from it in order to produce a first-pass design for the prospect well. With this data, our proposed workflow implements algorithms to identify representative offset wells with similar geology, trajectory, and other characteristics. A similarity analysis based on various geological sequences across the wells is conducted by a deep neural network trained to analyze the sequential patterns within them. Trajectory similarity is assessed using well surveys (dogleg severity, inclination, and azimuth); a recurrent neural network is employed to learn the well survey patterns and classify wells with similar trajectories. Finally, drilling performance metrics are used to rank the offsets and aid selection of the best offset design to be used as a base template for the prospect well. The proposed workflow significantly reduces analysis time; the software analysis takes less than five minutes. The user can almost instantaneously query the database, obtain wells that are similar in terms of trajectory and geology, and rank those wells on certain pre-defined metrics.
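To make the trajectory-similarity step concrete, the sketch below resamples surveys onto a common measured-depth grid and ranks offsets by distance in a simple feature space built from inclination and a crude dogleg rate. This is a deliberately simplified stand-in for the recurrent network the paper describes, and every name and the feature construction are assumptions for illustration only.

```python
import numpy as np

def survey_features(md, inc, azi, n=64):
    """Resample a well survey (measured depth, inclination and azimuth
    in degrees) onto a common grid and derive a crude dogleg-rate
    feature from station-to-station angle changes."""
    grid = np.linspace(md[0], md[-1], n)
    inc_g = np.interp(grid, md, inc)
    azi_g = np.interp(grid, md, np.unwrap(np.radians(azi)))
    d_inc = np.diff(np.radians(inc_g))
    d_azi = np.diff(azi_g)
    mid_inc = np.radians(inc_g[:-1])
    # approximate dogleg angle between consecutive resampled stations
    dogleg = np.sqrt(d_inc**2 + (np.sin(mid_inc) * d_azi)**2)
    return np.concatenate([inc_g / 90.0, dogleg * 100])

def rank_offsets(prospect, offsets):
    """Rank candidate offsets by Euclidean distance in feature space;
    prospect and each offset are (md, inc, azi) survey arrays."""
    p = survey_features(*prospect)
    dist = {name: np.linalg.norm(p - survey_features(*s))
            for name, s in offsets.items()}
    return sorted(dist, key=dist.get)
```

A learned sequence model will capture subtler trajectory patterns than this fixed feature vector, but the overall shape of the step is the same: encode each survey, then rank candidates by similarity to the prospect.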
The engineer is also able to view the key events and hazards history for the offset wells. This provides a compilation of hazards and risks by depth and cause, allowing the engineer to focus on mitigating risks and designing better, safer wells. The engineer can accept the top-ranked offset and automatically select a casing design based on default metrics of cost, time, and NPT, or can implement other metrics and create a composite casing design from hole sections of different offsets.