Development wells are drilled for different purposes that vary with business needs, such as flowing production, power water injection, gas production, or monitoring well performance. Each purpose is coupled with the technology used in the wellbore drilling activity and the artifacts required for workflow approval. Through evolving technology, the system captures every step of wellbore activity to address the business needs and streamline the approval process. Data fuels the Fourth Industrial Revolution (IR 4.0), and effective data analytics is a prerequisite for successful implementation of digital enterprise applications.
Well Approval is an in-house developed system embedded in the Petrel application for reservoir characterization geologists. It standardizes well approval documents across the organization. Geologists may generate multiple well approval documents when the plan changes; different scenarios require them to replace the plan reflected in the well approval document. Planning is the main pillar of the well approval process, and automating the planning part simplifies it. The system automates the generation of three critical documents. First, the Integrated Requirements (IRs), which consist of logging and threshold values for each hole section; the system also generates the reservoir amplitude, impedance, and porosity maps and allows the geologist to capture seismic maps from 3D geological models. Second, the Prognosis, detailing the geological horizons. Third, the Location Evaluation. The backbone of the system is its integration with back-end systems such as the corporate database, Petrel, OpenWorks, and Geolog. In addition, geologists are not required to select the offset wells: the offset-well algorithm designates wells for geologists based on criteria built into the system, such as reservoir type and dynamic log selection. The system handles both 2D and 3D environments based on the given parameters, such as facies; it detects the environment and prepares the documents accordingly.
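As an illustration of the offset-well designation logic described above, the sketch below filters candidate wells by reservoir type and availability of dynamic logs, then ranks them by distance to the planned location. This is a minimal sketch under assumed criteria: the Well fields, thresholds, and the select_offset_wells helper are hypothetical, not the system's actual rule set.

```python
# Hypothetical sketch of an offset-well selection rule; field names and
# thresholds are illustrative only, not the Well Approval system's criteria.
from dataclasses import dataclass
from math import hypot

@dataclass
class Well:
    name: str
    x: float              # surface easting, m
    y: float              # surface northing, m
    reservoir_type: str   # e.g. "carbonate", "clastic"
    has_dynamic_logs: bool

def select_offset_wells(planned: Well, candidates: list[Well],
                        max_distance_m: float = 5000.0, top_n: int = 3) -> list[Well]:
    """Keep wells in the same reservoir type with the required dynamic logs,
    then rank the survivors by distance to the planned location."""
    eligible = [w for w in candidates
                if w.reservoir_type == planned.reservoir_type and w.has_dynamic_logs]
    eligible.sort(key=lambda w: hypot(w.x - planned.x, w.y - planned.y))
    return [w for w in eligible
            if hypot(w.x - planned.x, w.y - planned.y) <= max_distance_m][:top_n]
```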
Since the system is aligned with the business pattern for the well approval process, it reduces the time required to create these documents from two to three (2-3) days to less than 1 hour (a reduction of more than 90%), and the data is readily available through the Real-Time Operations Center Knowledge System (also known as ROCKS) for further analysis and reporting.
Sour production from offshore and land-based wells causes hydrogen sulfide (H2S) release during downhole and topsides operations. Improper handling of H2S can lead to serious environmental and safety concerns as well as numerous corrosion and compliance issues. Consequently, H2S can add significantly to the total cost of well operations. The application of efficient H2S management technologies can reduce environmental and safety concerns, enable the use of lower-cost materials, and help meet H2S specifications. Several chemistries are commonly used to remove H2S from mixed production applications; the most common are triazines, glyoxal, and metal-based chemistries. Although each can be effective to a certain extent, these technologies either suffer from poor efficiency or can create serious side effects. The reaction of triazines with H2S in mixed production is highly inefficient and creates scaling. Glyoxals suffer from poor efficiency, thermal instability, and corrosivity. The metal-based chemistries are the most efficient in mixed production, but in certain application regimes they can create serious solids and emulsion issues. These challenges can increase CAPEX and OPEX as well as lead to significant downtime and lost production. To overcome the issues with currently used chemistries in mixed sour production, extensive research was conducted to identify a chemistry that would efficiently remove H2S while minimizing negative side effects.
A systematic evaluation was performed on a series of chemistries to compare their scavenging efficiency, with a special emphasis on mixed production systems. Focus was also placed on studying associated side effects, such as emulsification and scaling tendency, to ensure the chemistry exhibited few or none of the side effects seen with the more conventional chemistries. A high-throughput laboratory technique is presented that was designed to mimic scavenging tendency in a sour mixed-production environment. A continuous gas-flow testing technique that helped study the reaction kinetics is also described.
Laboratory and pre-field results proved the efficacy of the new non-MEA, non-triazine chemistry in mitigating H2S in upstream, midstream and downstream applications while being especially efficient in mixed production systems. Laboratory testing proved the chemistry to be highly efficient compared to triazine in mixed production systems. Results also indicated the chemistry is non-emulsion forming and has very little scaling tendency. Testing conducted in the field demonstrated that the new chemistry cost-effectively removes H2S and meets the operator specifications.
The novel, non-triazine scavenger technology offers significantly better performance than triazine, no emulsion concerns, an acceptable HSE profile, non-corrosive behavior, and fewer downstream concerns than MEA triazine or metal-based scavengers. The new and differentiated chemistry reduces CAPEX and OPEX, drives productivity, improves reliability, and reduces non-productive time.
Carbonate reservoirs often comprise a heterogeneous pore system within a matrix of variably distributed minerals, including anhydrite, dolomite, and calcite. When describing carbonate thin sections, it is routine to assign relative abundance levels to each of these components; such assignments are qualitative to semi-quantitative (e.g., point-counting) and vary greatly depending on the petrographer. Over the past few decades, image analysis has gained wide use among petrographers; however, thin section characterization using this technique has been primarily limited to the pore space because of the difficulty of optically recognizing anything beyond the blue-dyed epoxy associated with the pores. Here, we present a new method of computerized object-based image segmentation (Quantitative Digital Petrography: QDP) that relies on a predefined rule set to enable rapid, automated thin section quantification with only minor human interaction. We have developed a novel workflow that automatically isolates the sample on a high-resolution (i.e., <1 μm/pixel) scanned thin section, segments the image, and assigns those segments to predefined categories, e.g., pores, cement, grains, etc. Using this technique, statistically relevant numbers of thin sections can be rapidly processed and quality controlled, thereby allowing quantitative data such as MICP, wettability, and surveillance data to be integrated with the petrographic observations for a more complete description of the carbonate rock. Our technique can also incorporate multiple layers, such as cross-polarization, Back Scatter Electron (BSE) imaging, and elemental maps, which allow additional information to be easily integrated with the QDP results. The QDP approach is a significant improvement over previous digital image analysis methods because it 1) does not require binarization, 2) eliminates the subjectivity in assessing abundance levels, 3) requires less hands-on time for the petrographer, and 4) provides a much fuller dataset that can be incorporated across an entire well or field to better address common challenges associated with carbonate reservoir characterization, such as understanding pore type and cement abundance, pore connectivity, grain distribution, and reservoir flow characteristics.
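To make the rule-set idea concrete, the following is a minimal sketch of rule-based phase classification on a scanned thin-section RGB image. The colour thresholds, phase categories, and helper names (classify_phases, phase_fractions) are illustrative assumptions, not the authors' actual QDP rules.

```python
# Minimal sketch of rule-based phase classification on a thin-section scan,
# assuming an (H, W, 3) uint8 RGB array; thresholds are illustrative stand-ins
# for a predefined rule set, not the QDP implementation described above.
import numpy as np

def classify_phases(rgb: np.ndarray) -> np.ndarray:
    """Return an integer label map: 0=unclassified, 1=pore, 2=grain/matrix, 3=cement."""
    r, g, b = rgb[..., 0].astype(int), rgb[..., 1].astype(int), rgb[..., 2].astype(int)
    labels = np.zeros(rgb.shape[:2], dtype=np.uint8)

    # Rule 1: blue-dyed epoxy (pores): blue channel dominates red and green.
    pore = (b > r + 40) & (b > g + 40)
    # Rule 2: dark, low-brightness regions treated as grains/matrix.
    grain = ~pore & (r + g + b < 300)
    # Rule 3: bright, near-white regions treated as cement (e.g. sparry calcite).
    cement = ~pore & ~grain & (r + g + b > 600)

    labels[pore], labels[grain], labels[cement] = 1, 2, 3
    return labels

def phase_fractions(labels: np.ndarray) -> dict:
    """Area fraction of each phase, the quantitative output that can be
    integrated with MICP, wettability, and surveillance data."""
    total = labels.size
    return {name: float((labels == k).sum()) / total
            for k, name in {1: "pore", 2: "grain", 3: "cement"}.items()}
```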
Subsea Production Systems—Will 2019 Be a Tipping Point? The Golfinho gas field development is among the high-profile projects for which operators agreed to EPC contracts prior to FID. The 286 subsea trees ordered in 2018 were the highest quantity since 2013 and suggest an industry firmly on the right side of a recovery. Increasing orders are a welcome relief for the still-beleaguered subsea supply chain but have yet to translate into meaningful revenue growth. Revenues have remained relatively static over the past 24 months as the growing number of subsea projects hitting the EPCI phase is offset by a transition from high-value pre-2014 backlog to much lower-priced post-downturn contracts.
Robustness is a system property that reflects its resistance to the initiation of hazardous events and their progression beyond defined limit state thresholds. Under scoped perturbations, a robust system should not display disproportionate response or cascading failures. Three levels of robustness are identified: Operational (R1), Survival (R2), and Reserve (R3), spanning responses in state space from the safe operating limit (SOL) up to system failure at the ALS limit. Systems engineering methods can be used to define system boundaries and then map subsystems and their interactions. HAZIDs can be used to identify critical failure modes for engineering analysis. Risk methods can be used to assess the acceptability of the robustness measure. A robust system maintains essential minimum features under scoped perturbations.
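As a simple illustration of the R1/R2/R3 banding, the sketch below maps a scalar response measure onto the three levels between the SOL and the ALS limit. The intermediate thresholds, names, and the classify helper are hypothetical; the abstract does not prescribe this particular implementation.

```python
# Hypothetical sketch of the three-tier robustness classification: a scalar
# demand measure is compared against the SOL, two assumed intermediate
# thresholds, and the ALS (failure) limit.
from enum import Enum

class RobustnessLevel(Enum):
    WITHIN_SOL = "Within safe operating limit"
    R1_OPERATIONAL = "R1 Operational"
    R2_SURVIVAL = "R2 Survival"
    R3_RESERVE = "R3 Reserve"
    FAILED = "Beyond ALS limit"

def classify(demand: float, sol: float, r1_limit: float,
             r2_limit: float, als_limit: float) -> RobustnessLevel:
    """Map a system response (demand) onto the bands between SOL and ALS."""
    if demand <= sol:
        return RobustnessLevel.WITHIN_SOL
    if demand <= r1_limit:
        return RobustnessLevel.R1_OPERATIONAL
    if demand <= r2_limit:
        return RobustnessLevel.R2_SURVIVAL
    if demand <= als_limit:
        return RobustnessLevel.R3_RESERVE
    return RobustnessLevel.FAILED
```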
Not only that, it’s hard to operate, evolve, and troubleshoot. This post explores how applying the functional programming paradigm to data engineering can bring a lot of clarity to the process. This post distills fragments of wisdom accumulated while working at Yahoo, Facebook, Airbnb, and Lyft, with the perspective of well over a decade of data warehousing and data engineering experience. "In computer science, functional programming is a programming paradigm — a style of building the structure and elements of computer programs — that treats computation as the evaluation of mathematical functions and avoids changing-state and mutable data. It is a declarative programming paradigm, which means programming is done with expressions or declarations instead of statements."
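To ground that definition in a data-engineering setting, here is a minimal sketch of a batch task written in that style: a pure transformation plus an idempotent, partition-overwriting load. The table path, schema, and helper names are illustrative assumptions, not code from the post.

```python
# Sketch of a "functional" batch task, assuming a date-partitioned table on
# local disk; paths and schema are illustrative. The task is a pure function
# of its inputs (source rows + execution date) and replaces one immutable
# partition on write, so re-running it is idempotent.
import json
from pathlib import Path

def transform(rows: list[dict], ds: str) -> list[dict]:
    """Pure function: same inputs always yield the same output, no hidden state."""
    return [{"ds": ds,
             "user_id": r["user_id"],
             "amount_usd": round(r["amount"] * r["fx"], 2)}
            for r in rows]

def load_partition(table_root: Path, ds: str, rows: list[dict]) -> None:
    """Idempotent load: the ds partition is fully replaced, never appended to."""
    partition = table_root / f"ds={ds}"
    partition.mkdir(parents=True, exist_ok=True)
    (partition / "part-0000.json").write_text(
        "\n".join(json.dumps(r) for r in rows))

# Re-running the task for 2019-01-01 overwrites that partition and nothing else.
source = [{"user_id": 1, "amount": 10.0, "fx": 1.1}]
load_partition(Path("fct_payments"), "2019-01-01", transform(source, "2019-01-01"))
```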
Awan, Kamran (Petroleum Development Oman) | Al Aufi, Mohammed (Petroleum Development Oman) | Al Salti, Hilal (Petroleum Development Oman) | Al Noumani, Hussain (Petroleum Development Oman) | Nabavi, Bijan (Petroleum Development Oman) | Al Ghaithy, Ali (Petroleum Development Oman) | Al Busaidi, Khamis (Petroleum Development Oman) | Al Harrasi, Ali (Petroleum Development Oman) | Al Lamki, Ali (Petroleum Development Oman) | Al Mujaini, Rahima (Petroleum Development Oman) | Al Salhi, Mohammed (Petroleum Development Oman) | Al Nadabi, Seif (Petroleum Development Oman) | Al Abri, Abdulla (Petroleum Development Oman) | Al Zaabi, Yousef (Petroleum Development Oman) | Al Busaidi, Salim (Petroleum Development Oman)
‘Sweating the Asset’ is an integrated change management approach for maximizing cheap oil production from existing fields and facilities without capital expenditure. ‘Sweating the Asset’ utilizes a
The ‘Sweating the Asset’ goal was introduced as an organisational initiative in Q1 2017 with the aim of helping producing assets close their ‘gap to potential’ and operate as closely as possible to technical limit. The approach enables team leaders managing different components of the integrated production system to focus on a common goal and make aligned decisions.
The structured ‘Sweat the Asset’ process integrates components of the company's Lean Management System (LMS), including:
‘Sweating the Asset’ has been deployed in 13 production systems within the organisation, currently at different levels of maturity. As an example of goal deployment, an EOR polymer injection facility's suboptimal performance and poor compliance with injection and viscosity requirements had resulted in a severe decline in oil production. In order to safeguard production and close an estimated production gap of 4000 barrels/day, a goal was set to
The plan for 2018 is to ensure that ‘Sweating the Asset’ is fully embedded and sustainable in all assets across the organisation. This may be seen as a step change in, and the next level of, Wells, Reservoir and Facilities Management (WRFM) maturity. This paper will primarily focus on the Goal Deployment process and strategy.
The Mari Gas Field, discovered in 1957, is one of the oldest and most mature fields of Pakistan and currently the highest gas-producing field in the country. The field plays a vital role in ensuring the food security of Pakistan, as it meets the gas needs of 87% of the fertilizer industry. The field has been developed in multiple phases, with 97 producing wells and a complex gas gathering network of 284 km. In recent years, the country's gas supply-demand gap has been widening. In order to support national energy needs, MPCL management decided to launch a gas incremental and optimization drive to play its part in meeting the country's crucial energy demand. Furthermore, it was also a matter of great importance to sustain the enhanced production plateau and meet production and revenue targets.
Multidisciplinary teams were formed to evaluate the complete system from reservoir to delivery end. An integrated approach was adopted to assess the system at the existing state of production and at the targeted production rate, as de-coupled models would not have served the purpose well in this particular case. This study covers the integration of subsurface and surface facilities, linking the reservoir simulator at the IPR level through an in-house simulation model developed in a commercial simulator. Given the target of the study, i.e., incremental and sustained production, reservoir rates were constrained; hence a semi-implicit approach was used instead of fully implicit modeling, which saved a great deal of computation time and avoided complexities. The coupled model was simulated to predict the future behavior of the system at enhanced production, which identified bottlenecks and integrity problems in the existing system and proposed solutions through a series of sensitivity analyses. This exercise triggered a proactive approach to solving system issues and kicked off proper surveillance of the system, both of which are mandatory to sustain the desired production plateau and ensure the mechanical integrity of the system. The study further emphasized that evaluating the complete system through integrated modeling can save compression cost by de-bottlenecking the existing infrastructure.
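As a toy illustration of coupling at the IPR level, the sketch below pairs a standard gas-well back-pressure (deliverability) equation with a stand-in tubing-performance relation and bisects for the operating point. All coefficients and the VLP form are hypothetical assumptions for illustration, unrelated to the Mari field model.

```python
# Illustrative IPR/VLP coupling: the reservoir side is a Rawlins-Schellhardt
# back-pressure equation, the surface side a made-up tubing relation.
def ipr_gas_rate(p_wf: float, p_r: float = 1200.0, c: float = 2e-4, n: float = 0.85) -> float:
    """Deliverability: q = C * (p_r^2 - p_wf^2)^n  (q in MMscf/d, p in psia)."""
    return c * max(p_r**2 - p_wf**2, 0.0) ** n

def vlp_required_pwf(q: float, p_wellhead: float = 500.0) -> float:
    """Stand-in vertical-lift relation: bottomhole pressure needed to lift rate q."""
    return p_wellhead + 0.15 * q ** 1.8   # hypothetical hydrostatic/friction term

def operating_point(lo: float = 500.0, hi: float = 1200.0, tol: float = 1e-3):
    """Bisect on p_wf until the reservoir rate matches what the tubing can lift."""
    while hi - lo > tol:
        p_wf = 0.5 * (lo + hi)
        q = ipr_gas_rate(p_wf)
        if vlp_required_pwf(q) > p_wf:   # assumed p_wf too low to lift this rate: raise it
            lo = p_wf
        else:
            hi = p_wf
    p_wf = 0.5 * (lo + hi)
    return p_wf, ipr_gas_rate(p_wf)

bhp, rate = operating_point()
print(f"operating point: p_wf = {bhp:.0f} psia, q = {rate:.1f} MMscf/d")
```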
At present, the field is producing a 20% incremental volume over and above the initial base volume, and future plans have been formulated to sustain the production plateau as per the reservoir simulator.
Maritime Classification Societies have traditionally been the primary source for verification and validation of ship design, construction, and maintenance. In 1929, they began looking at the aviation industry as a new market for their services. The recent commercialization of aircraft, along with similarities in calculations and design elements to ocean-going vessels, caused the aviation industry, government entities, and insurance companies to turn to Classification Societies for verification and validation services. These services were to be fulfilled through the development and enforcement of Rule sets, as they had done for the maritime industry. These Rules for classing aircraft would have promoted structural integrity, airworthiness, and a culture of safety. Given that these societies employed engineers who were primarily naval architects and marine engineers rather than aeronautical specialists, Classification Societies' organizational structure did not properly align with the aviation industry's need for verification and validation processes. Maintaining an organizational structure focused on empirically gained maritime knowledge, along with the rapid growth of accessible tools developed by aircraft designers, ultimately caused Classification Societies to leave the industry by 1939, due in large part to the fact that government, not private industry, was taking control of the process. Since this experience, Classification Societies have been hesitant to provide verification and validation services to sectors outside that of ocean structures. Today, Classification Societies have a potential opportunity to provide organizational guidance on verification and validation processes in new industries such as maritime cyber security.
While swaged bulkheads have progressively become accepted for non-load bearing locations on U.S. built commercial ships, their widespread application in U.S. shipbuilding practice has been slowed by the lack of specific American Bureau of Shipping (ABS) rules for calculating the capacities of these configurations in load bearing situations. NASSCO has carried out several National Shipbuilding Research Program (NSRP) projects between 2010 and 2015 to collect a body of data on the strength of swaged bulkheads, including physical testing of full-size specimens, as well as linear and non-linear finite element analyses. The goal of the current Swage Panel SVR Rule Development Project (NSRP Research Announcement 2015-402) is to work towards the implementation of swage technology into commercial vessels built in the U.S. Currently in the third and final phase, this project furthers the previous work performed studying swage bulkheads. NASSCO has partnered with San Diego State University (SDSU) and ABS to develop swage bulkhead design guidance specific to the commercial use of swage panels in load bearing applications that will be incorporated into the ABS Steel Vessel Rules (SVR). Physical testing to validate the local swage bulkhead non-linear finite element analysis (NLFEA) predictions has been conducted in the structures laboratory at SDSU, including compression and shear load cases. Large-scale NLFEA performed using MAESTRO will incorporate swage bulkheads in vessel superstructures under realistic load profiles. This project's outcome will be a proposed rule for the 2020 ABS SVR based on the project's research, with scheduled completion in December 2018.