The paper proposes a more sophisticated understanding of risk, treating it as the interaction of distributions on orthogonal dimensions. This approach enables a novel understanding not only of incident causation but also of the solution strategies employed to prevent incidents. It also proposes a different approach to measuring the safety of an operation, one that avoids simply equating safety with the absence of incidents.
All organisations make money by taking risks and reducing them to the point where the outcomes are profitable. The main risks an organisation takes are generally seen as financial, but this is an over-simplification: organisations also take risks in all their operational decisions. For high-hazard organisations this is brought to the fore, as failures in quality and safety are more pronounced and have much higher-profile outcomes.
Risk assessments combine scenario outcomes and expected frequencies. Outcomes are often simplified, such as loss of primary containment or fatality, obscuring the wide range of possible variations and causes; the entire outcome distribution may be collapsed into four, five or six categories of severity. Frequencies are often simplified to order-of-magnitude point probabilities; a more sophisticated approach uses Bayesian conditional probabilities expressed as distributions. The Rule of Three, a simple but powerful risk management tool originally developed to understand helicopter accidents, uses a number of distinct dimensions that capture the multi-dimensionality of a variety of conditional causal factors.
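The contrast between a point-probability frequency and a frequency carried as a full distribution can be sketched as follows. This is an illustrative example only, not the paper's method: the incident counts, the Beta(1, 1) prior and the exposure figure are all invented for the sketch.

```python
import random

# Hypothetical illustration (not from the paper): an incident frequency can be
# carried either as a single point estimate or as a Bayesian posterior distribution.
# Assume 2 incidents observed over 1000 exposure periods, with a Beta(1, 1) prior.
incidents, periods = 2, 1000

# Point estimate: a single order-of-magnitude frequency.
point_estimate = incidents / periods  # 0.002

# Distribution: Beta posterior over the underlying frequency.
alpha, beta = 1 + incidents, 1 + (periods - incidents)
random.seed(42)
samples = [random.betavariate(alpha, beta) for _ in range(10_000)]

mean = sum(samples) / len(samples)               # close to the point estimate
p95 = sorted(samples)[int(0.95 * len(samples))]  # upper tail the point estimate hides
print(f"point={point_estimate:.4f} mean={mean:.4f} p95={p95:.4f}")
```

The point estimate and the posterior mean nearly coincide, but only the distribution exposes the upper tail, which is what a conditional, distribution-based treatment of frequency adds.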
Risk space typically has around seven or eight major orthogonal dimensions (e.g. equipment, people, weather, plan, environment). This interaction paradigm provides added insight not only into complex process incidents, but also into the increasing difficulty of attaining "goal zero", because the tools managing the risks are built with less sophisticated methodologies. In short, trying harder no longer delivers improved performance.
Risk space plots the system's sensitivity to upset as the result of a triggering event. One major advantage of the multi-dimensional risk space approach is that we can now define safety without reference to having or not having accidents. Safe operations can still have accidents if the trigger is severe enough; unsafe operations can still survive if the triggers are small enough.
By identifying distinct dimensions that can be combined, risk space enables us to plot smaller high-risk areas that are missed in more broad-brush approaches where most activity is safe but unexpected risks can present themselves. It is also possible to examine the trigger effects that may take normally safe systems into danger.
The paper will briefly discuss how these operational risks are understood in modern day accident causation models. It will also address how a change of paradigm may enable organisations to improve their understanding of the risks.
The paper introduces a multi-dimensional approach to understanding and managing risk. The approach provides greater explanatory power for incident causation and allows measurement of safety that is less sensitive to ‘good’ or ‘bad’ luck and more accurately reflects the true level of safety of an organisation.
Casing and tubing strings are the main structural parts of well construction. All wells drilled for the purpose of oil/gas production (or for injecting materials into underground formations) must be cased with material of sufficient strength and functionality. This chapter therefore provides the basic knowledge for practical casing and tubing strength evaluation and design. Casing is the major structural component of a well. It is needed to maintain borehole stability, prevent contamination of water sands, isolate water from producing formations, and control well pressures during drilling, production, and workover operations. Casing provides locations for the installation of blowout preventers, wellhead equipment, production packers, and production tubing. The cost of casing is a major part of the overall well cost, so selection of casing size, grade, connectors, and setting depth is a primary engineering and economic consideration. Tubing is the conduit through which oil and gas are ...
Kansao, Rami (Quantum Reservoir Impact, LLC) | Benhallam, Wassim (Quantum Reservoir Impact, LLC) | Aronovitz, Alexander (Quantum Reservoir Impact, LLC) | Ravuri, Meher (Quantum Reservoir Impact, LLC) | Maqui, Agustin (Quantum Reservoir Impact, LLC) | Suicmez, Vural (Quantum Reservoir Impact, LLC) | Castineira, David (Quantum Reservoir Impact, LLC) | Darabi, Hamed (Quantum Reservoir Impact, LLC)
An integral aspect of field development planning is the process of identifying remaining, feasible, and actionable field development opportunities (FDOs) such as recompletions and new drills. In mature brown fields, this process is extremely difficult to undertake in a reasonable timeframe due to the sheer size of the available datasets (e.g., thousands of wells, decades of production history), geologic complexity (e.g., dozens of layers, faulting, folding, fractures, pinchouts), and engineering challenges (e.g., commingled production and injection, field compartmentalization) unique to each reservoir.
The automated framework is fully streamlined and aims at building a comprehensive opportunity inventory consisting of behind-pipe recompletion opportunities, vertical new drill locations, sidetrack opportunities, and optimal deviated/horizontal targets. In addition to opportunity identification, it can generate additional deliverables for vetting purposes, including a well book report and a standardized Petrel project. The main objectives of this process include: 1) a fast, data-driven, and consistent approach for evaluating and identifying field development opportunities, 2) fast execution time, accelerating processes from months to days, 3) integration of multi-disciplinary data, 4) consistency and repeatability, minimizing subjectivity, and 5) streamlined visuals for opportunity vetting.
This framework has been successfully applied to a mature field in the Middle East with more than 1,000 producers and 55 years of production history. Hundreds of recompletion opportunities were initially identified for the field. A final list of opportunities was then generated by running the opportunities through a series of user-provided attribute filters such as minimum initial rate, minimum oil thickness, and maximum acceptable uncertainty. The process is typically finalized with manual vetting of the opportunities by subject matter experts, including geologists, reservoir engineers, and production engineers, who ensure the opportunities are geologically sound, mechanically feasible, and meet various validation criteria discussed in this document. A final list identified 116 recompletion opportunities in 116 wells with a total potential production upside of around 63,000 B/D.
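The attribute-filter step described above can be sketched in a few lines. The attribute names and thresholds below are invented for illustration; the paper does not publish its actual schema.

```python
# Hypothetical sketch of the user-provided attribute-filter step
# (field names and thresholds are illustrative, not the paper's schema).
candidates = [
    {"well": "W-01", "initial_rate_bpd": 450, "oil_thickness_ft": 25, "uncertainty": 0.2},
    {"well": "W-02", "initial_rate_bpd": 80,  "oil_thickness_ft": 12, "uncertainty": 0.5},
    {"well": "W-03", "initial_rate_bpd": 300, "oil_thickness_ft": 8,  "uncertainty": 0.3},
]

# User-provided filters: minimum initial rate, minimum oil thickness,
# maximum acceptable uncertainty.
filters = {"min_initial_rate_bpd": 100, "min_oil_thickness_ft": 10, "max_uncertainty": 0.4}

def passes(opp, f):
    """An opportunity survives only if it clears every filter."""
    return (opp["initial_rate_bpd"] >= f["min_initial_rate_bpd"]
            and opp["oil_thickness_ft"] >= f["min_oil_thickness_ft"]
            and opp["uncertainty"] <= f["max_uncertainty"])

final_list = [o for o in candidates if passes(o, filters)]
print([o["well"] for o in final_list])  # only W-01 clears all three filters
```

The surviving candidates would then go to the manual vetting stage by subject matter experts.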
Ferro, Santiago (ANCAP) | Rodríguez, Pablo (ANCAP) | Tomasini, Juan (ANCAP) | Gristo, Pablo (ANCAP) | Blánquez, Natalia (ANCAP) | Conti, Bruno (ANCAP) | Romeu, Cecilia (ANCAP) | Marmisolle, Josefina (ANCAP)
The recovery of first oil at surface onshore Uruguay, in the Norte Basin, was recently announced. Additionally, operators and the Uruguayan National Oil Company (ANCAP) have identified many prospects, almost ready to be drilled, at different water depths offshore Uruguay. It is therefore relevant to know the minimum volume of hydrocarbons a hypothetical conventional discovery must contain for the project to be economically feasible in light of the new Open Uruguay Round fiscal regime, both onshore and offshore.
The prospective resources distribution was calculated for several prospects, identified with 2D and/or 3D seismic, located onshore and offshore at various water depths. A probabilistic cash flow analysis, including forecasts for production, capital and operating costs, product prices and the new fiscal regime, was conducted in all cases to define the breakeven oil prices and the Minimum Economic Field Size (MEFS). Finally, the Net Present Value, the probability of geological success and the probability of development were considered to complete the expected monetary value and probability of commercial success calculations.
Many aspects can be taken away from this work. Firstly, as a general trend, the MEFS and the breakeven oil price required for the prospects increase with water depth, reflecting the rise in expenditures, although specific prospect parameters may cause departures from this clear trend. Likewise, the decision to sell or reinject natural gas has a significant influence on project profitability. Additionally, the probabilities of geological success, in the absence of productive analogues in the south Atlantic, are still low in all cases, reinforcing the frontier-exploration status of the Uruguayan sedimentary basins. Hence, with the objective of overcoming this weakness and encouraging exploration, the minimum government take established in the new Open Uruguay Round regime is fairly low. In conclusion, projects onshore and offshore could be equally profitable and successful, with the required discovery size and product price increasing with water depth. However, because of the remarkable difference in investment and costs, onshore opportunities fit smaller companies’ portfolios, while ultra-deepwater prospects are almost exclusively for the majors.
This paper presents new and useful information to geoscientists, engineers and managers of International Oil Companies (IOCs) evaluating exploratory projects in Uruguay, as it incorporates the new fiscal regime in force. Finally, it proposes an innovative methodology to calculate the MEFS, breakeven oil prices and the probability of commercial success from the stochastic model.
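The MEFS and breakeven logic described above can be sketched with a toy discounted-cash-flow screen. Everything numeric here is hypothetical: the cost figures, discount rate, production profile and sampled field-size distribution are invented, and the Open Uruguay Round fiscal terms are not reproduced.

```python
import random

# Hedged sketch of a probabilistic cash-flow screen (all numbers are hypothetical;
# the paper's fiscal terms and cost model are not reproduced here).
random.seed(7)

def npv(field_size_mmbbl, oil_price, capex_musd, opex_per_bbl, discount=0.10, years=15):
    """Discounted cash flow for a uniform production profile (a simplification)."""
    annual_bbl = field_size_mmbbl * 1e6 / years
    cash = -capex_musd * 1e6
    for t in range(1, years + 1):
        cash += annual_bbl * (oil_price - opex_per_bbl) / (1 + discount) ** t
    return cash

# Probability of a positive NPV across sampled field sizes
# (a triangular distribution stands in for the prospective resources distribution).
trials = 5000
positive = sum(
    npv(random.triangular(20, 200, 60), oil_price=60, capex_musd=800, opex_per_bbl=15) > 0
    for _ in range(trials)
)
print(f"P(NPV > 0) ~ {positive / trials:.2f}")

# Minimum Economic Field Size: smallest size with NPV >= 0 at the assumed price.
mefs = next(s for s in range(1, 500) if npv(s, 60, 800, 15) >= 0)
print(f"MEFS ~ {mefs} MMbbl at $60/bbl")
```

A breakeven price could be found the same way by sweeping `oil_price` at a fixed field size; the paper's stochastic model layers the probabilities of geological success and development on top of this to reach the probability of commercial success.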
With the increasing complexity of wells being drilled to access as much of the reservoir as possible, an integrated drilling risk assessment strategy is needed to address the increasing level of challenges and risks. Wells are being drilled deeper, through heavily depleted intervals into virgin, unexplored reservoirs, and are extending outward to horizontal lengths that were not even dreamed of several years ago. The assessment process described in this publication incorporates a better understanding of the complex geologic environment, such as a multitude of pressure compartments, complex faulting and lithological variations, that could potentially increase the risk magnitude. With these challenges comes a step change in understanding and quantifying the hazards, risk liabilities and uncertainties that must be addressed to make these projects commercially viable: reaching the designed targets without cost overruns due to NPT while, no less important, keeping a stable borehole capable of being evaluated and of delivering a successful production rate.
In order to address these challenges, more upfront in-depth planning and analysis must be performed. Often, drilling issues are reviewed around the symptoms of drilling events in order to respond to them, but not necessarily to determine why the events happened in the first place: the root cause. This shows when contingencies are built into a well plan to account for the NPT consequences of drilling events observed in offset wells. To proactively address these increasing risks and hazards, a detailed risk mitigation strategy must be developed and incorporated.
Experience has shown that leveraging drilling, geomechanical and geological models in the risk assessment process is successful in understanding the hazards that are present. Bringing out the potential hazards, however, is just part of the strategy. A multi-discipline team needs to analyze each potential hazard and determine the best strategy to reduce that risk to a safe, acceptable level. The collaboration of a multi-discipline team is the key to understanding all the different drilling, geological, geomechanical and fluids influences on a potential hazard. With this type of approach to assessing risk, the process can address all the variables required to design an intelligent, proactive drilling strategy. Even with the best strategy in place, there are geologic uncertainties that still need to be analyzed. Knowing these uncertainties, the multi-discipline team designs a mitigation strategy utilizing the right mix of technologies and a real-time monitoring component, so that models can be updated to fine-tune the drilling window and recommendations can be provided while drilling, optimizing the drilling process.
To show that this process is effective, several results are discussed, demonstrating the value of a pre-drill risk assessment combined with real-time monitoring in reducing drilling time and uncertainty and in providing safe, predictable operations from start to finish that leave ideal wellbore conditions for completion.
Zhang, Hongbao (Sinopec Research Institute of Petroleum Engineering) | Zeng, Yijin (Sinopec Research Institute of Petroleum Engineering) | Bao, Hongzhi (Sinopec Research Institute of Petroleum Engineering) | Liao, Lulu (Sinopec Research Institute of Petroleum Engineering) | Song, Jian (Sinopec International Exploration and Production Corporation) | Huang, Zaifu (Sinopec International Exploration and Production Corporation) | Chen, Xinjin (Sinopec International Exploration and Production Corporation) | Wang, Zhifa (Sinopec Research Institute of Petroleum Engineering / Sinopec Tech Middle East LLC) | Xu, Yang (Sinopec International Exploration and Production Corporation) | Jin, Xin (Sinopec Research Institute of Petroleum Engineering)
In unconventional oil & gas fields, relatively high well density and consistent well programs generate more data and greater potential to improve operations through data-driven methods. In these fields, daily report management systems are deployed to manage daily drilling & completion operation information, and may store millions of reports. Drilling & completion activities are recorded in the daily reports, and engineers spend a lot of time analyzing abnormal activities in them, such as pipe sticking, mud loss and influx, for operation monitoring, contractor performance evaluation and decision making. Because of the different activity coding systems used by projects in different countries, and because of missing-data problems, it is difficult for large or international oil companies to conduct automatic, unified anomaly detection on the database with rule-based methods.
Text classification based on machine learning and natural language processing techniques provides new solutions for automatic anomaly detection in daily reports. According to the characteristics of the drilling & completion activity recording texts, an automatic text classification method was proposed based on 460,000 operation records from 1,700 wells across 80 projects around the world. Word2vec was used to represent the texts numerically, and over-sampling and deep learning cost-function optimization were used to overcome the data imbalance. The performance of several machine learning models, such as naive Bayes, random forest, support vector machine, long short-term memory (LSTM) and convolutional neural network (CNN), was evaluated by ROC curve, and CNN proved to be the best text classification method for daily drilling & completion reports.
Finally, a 7-layer CNN model was trained to detect anomalies in daily drilling & completion reports with an accuracy of 85% and was successfully deployed in an online analysis system. The text classification model helped to shorten data analysis time and increase the technical management efficiency of the oil company.
The results proved the potential of machine learning and natural language processing for mining drilling & completion reports and laid the foundation for more intelligent solutions, such as knowledge discovery and early warning of operational problems.
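The classification task above can be illustrated with a far simpler baseline than the paper's Word2vec + CNN pipeline: a multinomial naive Bayes over bag-of-words, one of the models the paper benchmarked. The training records below are invented examples in the style of daily-report activity text, not the paper's dataset.

```python
import math
from collections import Counter, defaultdict

# Minimal multinomial naive Bayes over bag-of-words -- a much simpler baseline
# than the paper's Word2vec + CNN pipeline, shown only to illustrate the task.
# Records and labels are invented, not taken from the paper's 460,000-record dataset.
train = [
    ("circulate and condition mud, losses observed in annulus", "anomaly"),
    ("pipe stuck while pulling out of hole, worked string free", "anomaly"),
    ("influx detected, shut in well and circulated kill mud", "anomaly"),
    ("drilled ahead from 2500 m to 2650 m, no issues", "normal"),
    ("ran casing and cemented as per program", "normal"),
    ("rigged up wireline and logged section", "normal"),
]

word_counts = defaultdict(Counter)
class_counts = Counter()
for text, label in train:
    class_counts[label] += 1
    word_counts[label].update(text.split())

vocab = {w for counts in word_counts.values() for w in counts}

def predict(text):
    scores = {}
    for label in class_counts:
        total = sum(word_counts[label].values())
        # log prior + log likelihood with Laplace (add-one) smoothing
        score = math.log(class_counts[label] / len(train))
        for w in text.split():
            score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

print(predict("mud losses while drilling"))  # -> anomaly
```

In practice the class imbalance the paper mentions (anomalies are rare) is what makes over-sampling or cost-function weighting necessary; with balanced toy data, plain naive Bayes suffices.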
Materials are made of distinct constituents. The connectivity of material constituents governs several physical properties, such as transport, mechanical, and electromagnetic properties. High-resolution microscopy imaging of a material is the best way to capture the microstructural aspects describing the distribution, topology and morphology of various material constituents. In this study, we develop two novel metrics for robustly quantifying the connectivity of material constituents captured in high-resolution images. A two-point connectivity function and fast-marching-based travel-time histograms are developed to quantify the connectivity of each type of material constituent captured in the images. The two-point connectivity function for a specific constituent type is computed as a function of the separation distance between two randomly selected pixels belonging to that constituent type. A fast-marching-based travel-time histogram for a specific constituent type, on the other hand, is generated by using the fast marching method to compute the time taken by monotonically advancing interfaces, starting from several randomly selected pixels, to travel to each pixel belonging to that constituent type. The travel-time histogram indicates the tortuosity of connected paths, whereas the connectivity function indicates the length-scale of dominant globular connectivity. As a scalar measure of connectivity, the distributions corresponding to the two metrics are reduced to an average connected distance derived from the connectivity function and an average travel time derived from the fast-marching calculations. The performance of the two connectivity-quantification metrics is tested on 1500 images belonging to three categories of connectivity, namely poor, intermediate, and good connectivity, with 500 images for each category.
Then, the metrics are evaluated on the organic constituent captured in scanning electron microscopy (SEM) images of rock samples from various shale formations. Material constituents exhibiting high connectivity result in large values of average travel time and average connected distance. The average connected distances for the three categories of connectivity are 140.1, 14.6, and 5.6 pixels, respectively. The average travel times for the three categories of connectivity are 34.1, 5.2, and 1.9 seconds. The quantifications of connectivity using the two metrics show good agreement with each other and with visual inspection. For the two real SEM images exhibiting good and poor connectivity of the organic constituent, the average connected distances are 125.9 and 25.5 pixels, respectively, and the average travel times are 24.6 and 6.2 seconds, respectively, confirming the robust performance of the metrics.
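The idea behind the two-point connectivity function can be sketched on a tiny binary image. The grid, the Manhattan-distance simplification and the sampling scheme below are all invented for illustration; the paper's implementation (and its fast-marching counterpart) is not reproduced.

```python
import random
from collections import deque

# Hedged sketch of a two-point connectivity function on a tiny binary image
# (1 = constituent of interest). Illustrative only: the probability that two
# constituent pixels a distance r apart lie in the same connected component.
grid = [
    [1, 1, 0, 0, 1],
    [1, 1, 0, 0, 1],
    [0, 0, 0, 0, 1],
    [1, 0, 0, 1, 1],
    [1, 1, 0, 1, 1],
]
rows, cols = len(grid), len(grid[0])

# Label 4-connected components with breadth-first search.
label = [[-1] * cols for _ in range(rows)]
next_label = 0
for r in range(rows):
    for c in range(cols):
        if grid[r][c] == 1 and label[r][c] == -1:
            queue = deque([(r, c)])
            label[r][c] = next_label
            while queue:
                y, x = queue.popleft()
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and grid[ny][nx] == 1 and label[ny][nx] == -1):
                        label[ny][nx] = next_label
                        queue.append((ny, nx))
            next_label += 1

def connectivity(rdist, samples=2000, seed=0):
    """Fraction of sampled constituent pixel pairs at separation rdist
    (Manhattan distance, for simplicity) that share a component label."""
    rng = random.Random(seed)
    pixels = [(r, c) for r in range(rows) for c in range(cols) if grid[r][c] == 1]
    pairs = [(a, b) for a in pixels for b in pixels
             if abs(a[0] - b[0]) + abs(a[1] - b[1]) == rdist]
    if not pairs:
        return 0.0
    picked = [rng.choice(pairs) for _ in range(samples)]
    hits = sum(label[a[0]][a[1]] == label[b[0]][b[1]] for a, b in picked)
    return hits / samples

print(connectivity(1), connectivity(4))  # connectivity decays with separation
```

Averaging this fraction over separation distances would give a toy analogue of the paper's average connected distance; the travel-time histogram additionally weights paths by their tortuosity, which simple component labeling cannot capture.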