Ahmed, Syed (Saudi Arabian Oil Company, Saudi Aramco) | Al-Zubail, Ahmad (Saudi Arabian Oil Company, Saudi Aramco) | Al-Jeshi, Majed (Saudi Arabian Oil Company, Saudi Aramco) | Yousef, Khaled (Saudi Arabian Oil Company, Saudi Aramco) | Musabbeh, Alya (Saudi Arabian Oil Company, Saudi Aramco) | Mousa, Saad (Saudi Arabian Oil Company, Saudi Aramco) | Bukhari, Adeeb (Saudi Arabian Oil Company, Saudi Aramco) | Seraihi, Emad (Saudi Arabian Oil Company, Saudi Aramco) | Alamri, Sultan H. (Saudi Arabian Oil Company, Saudi Aramco)
This paper describes an integrated solution that leverages Industry 4.0 technologies to sustain crude quality specifications across the Saudi Aramco supply chain, covering more than 50 GOSPs (Gas Oil Separation Plants), pipelines, and terminals. Sustaining crude quality specifications such as water content (BS&W) and salt content for the Arabian crudes (Arab Light, Arab Extra Light, etc.) requires big data analysis across the supply chain. To address this challenge, Saudi Aramco developed a customized solution called the Crude Quality Monitoring Solution (CQMS), which ingests 800 critical data streams (PI values) every minute and classifies the data according to the risk level impacting crude quality specifications. The three risk levels developed are leading proactive, lagging proactive, and lagging reactive, each of which has a defined acceptable risk matrix. Each risk matrix initiates automated notifications for proactive corrective action. Moreover, patterns and operational events can be easily recognized visually in the solution. The paper also describes several examples where the solution's notifications have proactively remediated process disturbances by up to 20% at upstream and downstream facilities while ensuring asset integrity. The solution's deployment has also substantially improved operational efficiency across the network by benchmarking critical data streams. Saudi Aramco continues to enhance the solution's capabilities to maximize the value of the crude network.
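The three-tier risk classification described above can be sketched as a simple threshold rule per data stream. This is an illustrative sketch only: the abstract names the three CQMS risk tiers, but the thresholds, tag names, and the 80% warning fraction below are hypothetical assumptions, not the published CQMS logic.

```python
# Hypothetical sketch of CQMS-style risk classification for one PI stream.
# The spec limits and the 0.8 warning fraction are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class StreamReading:
    tag: str           # e.g. a PI tag for BS&W or salt content (hypothetical name)
    value: float       # latest minute-level reading
    spec_limit: float  # crude quality specification limit for this stream

def classify_risk(r: StreamReading, warn_fraction: float = 0.8) -> str:
    """Map a reading to one of the three CQMS-style risk levels."""
    if r.value < warn_fraction * r.spec_limit:
        return "leading proactive"   # well within spec; trend-watch only
    if r.value < r.spec_limit:
        return "lagging proactive"   # approaching spec; corrective action advised
    return "lagging reactive"        # spec exceeded; immediate action required

def notify(r: StreamReading) -> str:
    """Format the automated notification the risk matrix would trigger."""
    return f"{r.tag}: {r.value} vs spec {r.spec_limit} -> {classify_risk(r)}"

print(notify(StreamReading("GOSP-01/BSW", 0.17, 0.2)))  # lands in the lagging proactive band
```

In a production system each tier would map to its own acceptable risk matrix and escalation path rather than a single threshold pair.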
Jain, Siddharth (Sharjah National Oil Corporation) | Al Hamadi, Masoud (Sharjah National Oil Corporation) | Stewart, Neil (Sharjah National Oil Corporation) | Khan, Sikandar (LMKR) | Malik, Arshad (LMKR)
Capitalizing on untapped potential and quantifying risk is the key to success in a turbulent commodity industry. With an internal initiative to capture national knowledge and to safeguard sovereign data, Sharjah National Oil Corporation (SNOC) embarked on a multi-year journey to find, archive, digitize and integrate the entire E&P dataset in the Emirate, dating from 1966 to the present. The challenges and accomplishments of effective data management are demonstrated as a successful case study.
Securing investment in a country’s petroleum industry demands that government agencies add value to its oil & gas assets by making them readily available globally. One of the most important steps was to effectively catalogue and manage technical information from both past and present exploration and production initiatives. Scattered archives of multi-disciplinary data, including well reports, logs, maps, seismic sections, tapes, G&G studies, cores, cuttings and digital databases, were centralized in one repository for the Emirate. A unified data quality assessment, metadata capture and naming convention was established for all data types, together with a technology infrastructure to enable maximum utilization and integration of data into future projects. Ambitious to exploit its current technical assets by evaluating historic data for maximum benefit to SNOC, partners and global investors, SNOC adopted a cohesive and effective physical and digital asset management approach with a phased methodology in early 2016. This data management activity was classified as extremely critical to current and upcoming E&P projects, laying the foundation for drilling SNOC’s 1st Exploration well, a successful farm-out bid round, quantifying workover opportunities, regional interpretation and field development simulations. Centralizing and integrating the data management function enabled the operator to identify data gaps and work on filling them to successfully establish a regional technical data center. The paper discusses the challenges of gathering, extracting and incorporating legacy data, shedding light on the value it generated in paving the way for a digital transformation in SNOC’s E&P business.
Fluctuating technological, political and business influences add to the volatility and risk in the oil and gas business. The framework and success of future opportunities rely on the data used to quantify them. The method chosen for this journey was to modularize the data management project and structure best-in-class industry solutions for each aspect. The data management architecture implemented is cost-effective, scalable, and upgradable with minimum effort into a digital hub of information and knowledge management.
Reservoir management guidelines are an enabler of production sustainability, assurance of reservoir health, and high ultimate recovery. Monitoring the compliance of field production against the set of reservoir management guidelines is one of the key processes for ADNOC, the governing body of the major U.A.E. hydrocarbon producing fields.
With the business need to ramp up production, field maturation, and the associated operational challenges, it is critical for ADNOC to effectively monitor and regulate its field production plans to assure long-term production sustainability. In this regard, ADNOC has developed a robust framework, implemented through an automated analytics platform, that enables different ADNOC technical teams to effectively monitor and report the compliance status of each hydrocarbon barrel produced from ADNOC assets. The paper highlights the features of the workflow implemented, the management of change strategy and the business value created.
The automated process allows the consolidation of a variety of well-, reservoir- and field-level data. The analytical platform enables integrated analysis, KPI calculation and interactive visualization. The framework assesses compliance based on three governing parameters: well technical rate, gas-oil ratio (GOR), and bottomhole flowing pressure. The compliance analysis is carried out on a monthly basis, where the monthly back-allocated production data for each well are compared with the set of operating guidelines in an automated data analytics and visualization environment. A pragmatic compliance tolerance is considered in the calculations to accommodate measurement inaccuracies as well as operational limitations, while allowing flexibility to exclude nonconformities with valid reasons.
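The monthly compliance test on the three governing parameters can be sketched as a tolerance-band comparison. This is a hedged sketch only: the parameter values, the 5% tolerance band, and the direction of each limit are illustrative assumptions, not ADNOC's actual guideline figures.

```python
# Hypothetical sketch of a monthly well-compliance check against operating
# guidelines. The 5% tolerance and all numbers below are assumed for illustration.
def is_compliant(actual: float, guideline: float, tolerance: float = 0.05,
                 upper_limit: bool = True) -> bool:
    """Compare back-allocated monthly data against an operating guideline,
    allowing a pragmatic tolerance band for measurement inaccuracy."""
    band = tolerance * guideline
    if upper_limit:                        # e.g. technical rate or GOR ceiling
        return actual <= guideline + band
    return actual >= guideline - band      # e.g. flowing BHP floor

# Assumed back-allocated monthly data for one well vs. its guidelines.
well = {"technical_rate": 5200.0, "gor": 910.0, "bhp": 2450.0}
guidelines = {"technical_rate": 5000.0, "gor": 900.0, "bhp": 2400.0}

checks = {
    "technical_rate": is_compliant(well["technical_rate"], guidelines["technical_rate"]),
    "gor": is_compliant(well["gor"], guidelines["gor"]),
    "bhp": is_compliant(well["bhp"], guidelines["bhp"], upper_limit=False),
}
print(checks)  # the well is compliant only if all three parameters pass
```

Exclusions with valid reasons would be handled as a separate flag upstream of this check rather than by widening the tolerance band.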
The overall process is governed through an automated business process management (BPM) platform, which seamlessly regulates the predefined subroutines among different stakeholders to report and track different corrective actions in a timely manner.
The framework implementation has strengthened the overall compliance governance process and has been instrumental in managing asset production capacity in a systematic manner. This has subsequently enabled the preparation of prompt action plans and has improved operating efficiency by more than 3% within the first six months of implementation, through restoring, compensating and increasing the effective capacity of overall ADNOC production.
The approach has demonstrated great value both in terms of process alignment and from the production assurance standpoint at a country level, and allows the organization to have an established system which provides:
Consistent compliance monitoring standards
Minimal subjectivity
Complete process governance
Quick turnaround time
Auditable history
The aim of this paper is to provide a stepwise guide for operators interested in adopting and implementing a similar approach to assure the long-term production sustainability and health of their assets.
There are various factors that contribute to well planning: cost, complexity of the reservoir, safety measures, isolating problem areas, engineering, etc. The whole process of well analysis takes a lot of time and effort, as the method used is mostly manual. In this abstract, we explain the Big Data approach used to perform well planning, which significantly reduces the drilling engineer's time. Possible future enhancements to the existing system are also discussed in this paper.
Over the past few years, the concept of Big Data has gradually matured. Drilling Engineers, Data Management and the Solution Development Team have jointly discussed the potential and functionality of this concept to optimize the drilling engineer's performance and the timeframe for preparing and delivering drilling programs for new development wells. EDM (Enterprise Drilling Management System), a drilling suite of applications; an in-house drilling data extraction tool with a web-based interface; and many different types of Excel sheets were considered as the input sources. The well planning process was automated for offset and related data acquisition, and results and knowledge were captured in a central repository to be used in future endeavors.
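The offset-well acquisition step described above can be sketched as a radius search around the planned surface location. The well names, coordinates, and search radius below are hypothetical; in the real workflow the candidate list would come from EDM or the in-house extraction tool rather than a hard-coded list.

```python
# Illustrative sketch of automated offset-well selection for well planning.
# All well data and the 2 km radius are assumed values for demonstration.
from math import hypot

def offset_wells(planned_xy, wells, radius_m=2000.0):
    """Return wells within a given radius of the planned surface location,
    sorted nearest-first, so offset data can be pulled automatically."""
    x0, y0 = planned_xy
    return sorted(
        (w for w in wells if hypot(w["x"] - x0, w["y"] - y0) <= radius_m),
        key=lambda w: hypot(w["x"] - x0, w["y"] - y0),
    )

# Hypothetical candidate wells (coordinates in meters from a local origin).
candidates = [
    {"name": "WELL-A", "x": 500.0, "y": 300.0},
    {"name": "WELL-B", "x": 5000.0, "y": 4000.0},
    {"name": "WELL-C", "x": -800.0, "y": 900.0},
]
print([w["name"] for w in offset_wells((0.0, 0.0), candidates)])  # -> ['WELL-A', 'WELL-C']
```

Once the offsets are identified, their drilling programs and lessons learned would be fetched from the central repository for reuse.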
In this paper we set out how we maximise the value created by the digital revolution through the use of Systems Thinking and Agile techniques to establish a FEL 0-1 Digital Twin. We then describe how we use a BIM approach to evolve this Digital Twin through the project lifecycle, fostering collaboration, breaking down silos, and creating and protecting value as we do so. Two case studies, one an offshore gas compression project and the other a normally unmanned wellhead installation, are presented to demonstrate the application and effectiveness of this approach.
Meda, Marco (ENI) | Martinelli, Mattia (Università degli Studi di Milano - Bicocca) | Bistacchi, Andrea (Università degli Studi di Milano - Bicocca) | Mittempergher, Silvia (Università degli Studi di Milano - Bicocca) | Berio, Luigi (Università degli Studi di Parma) | Balsamo, Fabrizio (Università degli Studi di Parma) | Succo, Andrea (Università degli Studi di Parma) | Storti, Fabrizio (Università degli Studi di Parma)
Selecting the "best outcrop analogue" of a subsurface field/prospect is always challenging, especially when dealing with fractured carbonate reservoirs. The candidate should match the mechanical stratigraphy, the depositional conditions, the diagenetic history, and the tectonic evolution. This is almost impossible, considering that at least the exhumation phase and the associated diagenetic features will not be shared between the outcropping analogue and the buried reservoir. Nevertheless, the analysis of natural analogues can provide useful indications, particularly in a matter as complex as fracture distribution; in fact, large-scale outcrop analogues reveal their potential when trying to fill the gap between seismic- and borehole-scale structural characterization.
In order to start building an "Atlas of Fracturing Facies" as a digital interactive catalogue of natural fractured analogues, three main cases have been studied: the Pag (Croatia) and Parmelan (France) anticlines as analogues for folded and faulted platform carbonates affected by pre-folding extensional faulting, and Gozo Island (Maltese Archipelago) as an example of carbonate sequences affected by extensional tectonics. An integrated multiscale approach has been applied, from thin sections to outcrop-scale analysis, and from drone-based surveys to satellite image interpretation. This workflow leads to the reconstruction of 3D models and to the quantification of the main parameters characterizing the fracture pattern and its variability.
The Island of Pag, in the External Dinarides of Croatia, is a thrust-related anticline that involves Upper Cretaceous to Eocene shallow-water carbonate platform sequences affected by tight folding during Eocene to Oligocene times. The fold evolution is multiphase, expressed by pre-folding features developed during layer-parallel shortening with a strong influence of structural inheritance, followed by fold- and thrust-related cataclastic flow in hinge zones.
The Parmelan Anticline, in the Bornes Massif, Western Alps, is a box-fold involving Lower Cretaceous massive platform carbonates. It is characterized by steeply-dipping limbs, separated by a wide crestal plateau, delimited by narrow hinge zones localized on inherited extensional faults. Its polyphasic tectonic history has been reconstructed by analyzing the fracture and vein pattern, which highlighted the strong influence of structural inheritance during folding.
Gozo Island comprises a Late Oligocene to Late Miocene carbonate sequence, composed of platform carbonates with different facies, affected by two extensional events associated with a multi-set fracture pattern. In Gozo, spectacular coastal outcrops allowed analysis of the structural and statistical relationships between fractures and faults, in terms of density, length, orientation, spatial distribution patterns, and topology.
The Pag, Parmelan and Gozo case studies, together with several literature case studies, are the starting point for implementing an Atlas of Fracturing Facies, a multidisciplinary knowledge management and data repository platform to improve the prediction of fracture patterns in the subsurface and their impact on porosity and permeability in reservoirs.
The complexity of the recovery mechanism generally increases with the maturation of a producing field; therefore, the granularity of the reservoir analysis must increase proportionally to better understand the well and reservoir dynamics. ADNOC's Integrated Reservoir Management (IRM) Framework has instituted a set of workflows to focus on analyzing reservoir performance at the sector level to assure reservoir performance sustainability. ADNOC has developed and implemented a robust automated Sector Performance Review (SPR) process using state-of-the-art analytics and a business process management tool. The main objective of this work is to foster collaboration among multiple disciplines to assess reservoir performance, as well as to identify, interpret and implement profitable opportunities through a centralized platform.
A unique process has been implemented in the five major assets of the ADNOC group, which cover approximately 50% of overall UAE production. The system leverages automated integration of subsurface data from numerous sources; live analytics visualization provides reservoir performance insights at the sector level through automatically calculated KPIs and diagnostic trends.
The solution offers an innovative way to establish collaboration among different assets to assimilate reservoir performance insights through a sustainable platform. Some of the immediate benefits include the effective execution of a reservoir management scheme, monitoring the variance between actual and anticipated performance during the course of projects, and assuring production target compliance while proactively mitigating shortfalls. A reservoir-level analysis tool (RPR) was piloted during 2017 and presented in SPE-193012-MS. However, with the necessity of having analysis at the sector level, the overall design of the solution was transformed to allow users to carry out the analysis at sector- and well-level granularity. The SPR (Sector Performance Review) is enabling a consistent approach across all assets for all subsurface performance review processes, while improving efficiency through automation of data gathering and presentation and the identification of all underperforming reservoirs, sectors, and wells.
Seismic data are among the earliest data acquired in a prospect evaluation, and they are utilized throughout the exploration and production stages of a prospect. With recent advances in the handling of big data, it is essential to re-evaluate best practices in the seismic data ecosystem. This paper presents the idea of leveraging technology advancements in big data and cloud computing for the seismic data ecosystem, with the aim of providing an improved user experience.
This new seismic platform would be capable of handling, managing and delivering the full spectrum of seismic data varieties, from acquired field data to interpretation-ready processed data. The system is to have the following capabilities:
Capability to entitle the right portion of data to every user as per interest
Organization of seismic data as per the business units
Data security by sharing data only with legitimate users/groups
Direct or indirect integration with all the data sources and applications that consume and/or generate data
Sharing of and collaboration on data within the company and/or across organizations, for shareholding partners, prospective seismic buyers for trading and relinquishment, regulatory agencies, resource certifying agencies, service providers, etc., over limited network connectivity
Integration/data delivery to end-user applications where this seismic data will be utilized
Implementation of the seismic ecosystem will enable:
Sharing of seismic data across the acquisition, quality control, data processing and interpretation user communities from one centralized storage
Collaboration of stakeholders in real time over an encrypted network
Leveraging cloud and mobility technology advancements for agility and interaction; the system will be connected and interactive yet backed by a complex high-performance computing infrastructure
Data delivery and auditing to a wider and more diverse user community that consumes data from different platforms
Secure data access based on organizational business units to ensure data does not fall into unauthorized hands
Reduction in seismic data turnaround time by reading and ingesting large volumes of data through parallel input/output operations
Improved data delivery and a map interface with contextual information from the centralized data store
Augmentation of traditional workflows with machine learning and artificial intelligence, for example automated fault detection
The proposed best practice aims to bring all of the different disciplines working with seismic data to one centralized seismic data repository and enable them to consume and share seismic data from a big data lake. This is live and interactive compared with the traditional approach of archive-and-restore systems in standalone applications.
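The parallel input/output idea listed above can be sketched as concurrent reads of chunks of a large seismic volume before loading them into the central store. The chunk layout, reader function, and worker count below are simplified assumptions; a real implementation would use a SEG-Y reader against object storage rather than the placeholder shown here.

```python
# Illustrative sketch of parallel chunked ingestion of a seismic volume.
# read_chunk is a placeholder: a real system would read SEG-Y traces from
# disk or cloud object storage instead of returning synthetic samples.
from concurrent.futures import ThreadPoolExecutor

def read_chunk(chunk_id: int) -> list:
    """Placeholder for reading one block of traces (4 samples per chunk here)."""
    return [float(chunk_id)] * 4

def ingest(n_chunks: int, workers: int = 4) -> int:
    """Read all chunks concurrently and return the total sample count ingested."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        chunks = list(pool.map(read_chunk, range(n_chunks)))
    return sum(len(c) for c in chunks)

print(ingest(8))  # -> 32 samples read across parallel workers
```

For I/O-bound reads from object storage, threads (or async I/O) overlap network latency; CPU-heavy decoding would instead favor process-based parallelism.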
The oil and gas and petroleum refining industry is undergoing significant business challenges, transformation and consolidation. Refinery business economics has encountered increased challenges due to significant variations in market dynamics. Refining optimization has acquired renewed importance in the past decade, and optimization of existing facilities has become a key component of the overall business strategy. Operational Excellence is a necessary attribute for optimal business performance in progressive companies. Operational Excellence and Business Excellence are the key focus areas for the petroleum refining industry in the Middle East and other regions; they require best-in-class performance in all functional areas of relevance to the business, and they are closely inter-related, complementing one another towards superior business performance. Operational Excellence groups deal with activities that help inculcate a culture of excellence and support the effective implementation of continuous improvement activities across company facilities. Periodic assessment of Operational Excellence is extremely important for identifying status and taking the necessary corrective and additional measures towards continuous improvement. ADNOC Refining has been involved in Operational Excellence and Business Excellence activities for the past two decades, with and without the involvement of external consultants. This paper deals with key aspects of the Operational Excellence review, the methodology of systematic implementation, key challenges, and experiences of a first-of-its-kind simultaneous implementation across ADNOC Refining operating facilities.
Infosys will acquire Houston-based Noah Consulting, an advanced information management consulting services company for the oil and gas industry, in a cash deal worth USD 70 million. The new acquisition will combine Noah's industry knowledge, information strategy planning, data governance, and architecture capabilities with Infosys' ability to provide technology and outsourcing services on a global scale to oil and gas customers. The sale is expected to close before the end of the third quarter of next year.