Oilfield tubulars have traditionally been designed using a deterministic working stress design (WSD) approach, which is based on multipliers called safety factors (SFs). The primary role of a safety factor is to account for uncertainties in the design variables and parameters, primarily the load effect and the strength or resistance of the structure. While grounded in experience, these factors give no indication of the probability of failure of a given structure, because they do not explicitly consider the randomness of the design variables and parameters. Moreover, safety factors tend to be rather conservative, and most design limits are established using failure criteria based on elastic theory. Reliability-based approaches are probabilistic in nature and explicitly identify all the design variables and parameters that determine the load effect and strength of the structure.
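The contrast between a fixed safety factor and a reliability-based view can be sketched with a small simulation. Everything below is an illustrative assumption, not from the text: load effect and resistance are modeled as lognormal random variables whose medians imply a deterministic SF of about 1.4, and the probability of failure is the frequency with which sampled load exceeds sampled resistance.

```python
import random

random.seed(1)

# Monte Carlo estimate of the probability of failure P(R < L) for a
# hypothetical tubular. A deterministic WSD check would only compare the
# medians (SF ~ 1.4) and report "safe"; the probabilistic view quantifies
# how often the randomness in R and L still produces failure.
# All distribution parameters are illustrative assumptions.
N = 200_000
failures = sum(
    random.lognormvariate(0.0, 0.10)       # resistance R, median 1.0
    < random.lognormvariate(-0.35, 0.15)   # load effect L, median ~0.70
    for _ in range(N)
)
pf = failures / N
print(f"Estimated probability of failure: {pf:.4f}")
```

The point is that the same nominal safety factor can correspond to very different failure probabilities depending on the scatter of the underlying variables.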
In the context of risk, uncertainty assessment is the process of combining uncertainties, as with a Monte Carlo simulation, to generate output parameters that are represented by probability distributions. No involvement of risk is implied. Example: multiplying ranges of length and width to arrive at a range of resulting areas.
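The length-and-width example above can be sketched in a few lines; the uniform ranges below are assumptions chosen purely for illustration:

```python
import random

random.seed(0)

# Uncertainty assessment by Monte Carlo: combine an uncertain length
# (9-11 m) and width (4-6 m) into a probability distribution of areas,
# summarized here by its P10/P50/P90 percentiles.
N = 100_000
areas = sorted(
    random.uniform(9.0, 11.0) * random.uniform(4.0, 6.0) for _ in range(N)
)
p10, p50, p90 = (areas[int(N * q)] for q in (0.10, 0.50, 0.90))
print(f"Area P10/P50/P90: {p10:.1f} / {p50:.1f} / {p90:.1f} m^2")
```

The output is not a single area but a distribution, which is exactly what distinguishes uncertainty assessment from a deterministic calculation.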
Abstract Fluid saturation data obtained from core analysis are used as control points for log calibration, saturation modeling and sweep evaluation. These lab-derived data are often viewed as ground-truth values without a fundamental understanding of the key limitations of the experimental procedures or scrutiny of the accuracy of the measured lab data. This paper presents a unique assessment of sponge core data through parameterization, uncertainty analysis and Monte-Carlo modeling of critical variables influencing lab-derived saturation results. This work examines typical lab data and reservoir information that could impact final saturation results in sponge coring. We dissected and analyzed ranges of standard raw data from Dean-Stark and spectrometric analysis (including gravimetric weights, distilled water volumes, pore volumes and sponge absorbance), input variables of fluid and rock properties (such as water salinity, formation volume factors, plug dimensions and stress corrections), governing equations (including salt correction factors, water density correlations and lab mass balance equations) and other factors (for instance, sources of water salinity, filtrate invasion, bleeding by gas liberation and water evaporation). Based on our investigation, we identified and statistically parameterized 11 key variables to quantify the uncertainty in lab-derived fluid saturation data in sponge cores. The variables' uncertainties were mapped into continuous distributions and randomly sampled by Monte-Carlo simulation to generate probabilistic saturation models for sponge cores. Simulation results indicate the significance of the water salinity parameter in mixed-salinity environments, ranging from 20,000 to 150,000 ppm. This wide range of water salinity produces an uncertainty spectrum of core oil saturation of +/- 3 to 10 saturation units.
Consequently, we developed two unique salinity variance models to capture the water salinity effect and minimize the uncertainty in the calculation of core saturation. The first model uses a material balance to solve for the salinity given the distilled water volume and the gravimetric weight difference of the sample before and after leaching. The second model iteratively estimates the salinity required to achieve 100% total fluid saturation at reservoir conditions after correcting for bleeding, stress and water evaporation effects. Our work shows that these derived models of water salinity are consistent with water salinity data from surface and bottom-hole samples. Despite the prominence of core saturation data in many applications across the industry, thorough investigation into their quality and accuracy is usually overlooked. To the best of our knowledge, this is the first paper to present an uncertainty analysis coupled with Monte-Carlo simulation of lab-derived saturation data from sponge cores. The modeling approach and results highlighted in this work provide a fundamental framework for modern uncertainty assessment of core data.
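As a rough sketch of how salinity uncertainty alone propagates into a saturation estimate, consider a simplified Dean-Stark-style volume balance. This is an illustrative simplification, not the authors' actual model: the plug measurements and the linear brine-density correlation below are invented for the example, while the 20,000-150,000 ppm salinity range is the one cited in the abstract.

```python
import random

random.seed(7)

# Assumed fixed measurements for a single hypothetical plug:
V_WATER_DISTILLED = 2.10   # cc of distilled water recovered (assumed)
V_PORE = 5.00              # cc pore volume (assumed)
RHO_FRESH = 0.9982         # g/cc, pure water at lab conditions

def brine_density(ppm):
    """Rough linear density correlation with salinity (assumed)."""
    return RHO_FRESH + 7.3e-7 * ppm  # g/cc

N = 50_000
sw_samples = []
for _ in range(N):
    salinity = random.uniform(20_000, 150_000)   # ppm, range from the abstract
    m_water = V_WATER_DISTILLED * RHO_FRESH      # g of pure water recovered
    m_brine = m_water / (1.0 - salinity / 1e6)   # add back dissolved salt mass
    v_brine = m_brine / brine_density(salinity)  # in-situ brine volume, cc
    sw_samples.append(v_brine / V_PORE)          # water saturation, fraction

spread_su = (max(sw_samples) - min(sw_samples)) * 100.0
print(f"Water-saturation spread from salinity alone: {spread_su:.1f} saturation units")
```

Even this toy balance produces a multi-point saturation spread from the salinity assumption alone, which is the effect the two salinity variance models are designed to constrain.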
The process of identifying and evaluating the technical and non-technical risks associated with a project. It includes the amount or degree of potential danger perceived (by an assessor) when determining a course of action to accomplish a given task. Risk assessment may be qualitative or quantitative.
Abstract In September 2018, a field engineer’s mistake while replacing cast iron main in Lawrence, Massachusetts (USA), led to a death and over a billion dollars of property damage. Residents and businesses in the area were without natural gas service for months, at a time of the year when temperatures had already started dropping. One year later, every natural gas utility in the United States is bearing a share of the backlash, as the National Transportation Safety Board starts to lay the groundwork for increased regulations nationwide. One aspect of the heightened attention on natural gas utilities is the vulnerability of the regulator stations themselves, even though the cause of the disaster cited above was traced back to a misplaced sensing line. Regulator stations are such a critical element of a distribution network that any comprehensive risk analysis program must include them. One tool that could help is a regulator station risk model that evaluates the potential fail-open and fail-closed scenarios for numerous regulator station design types. The likelihood of these events is then paired with the potential consequences, including health and safety, property damage, and network outage impacts. The outputs of these risk models form a critical set of inputs for an overarching probabilistic risk model, as promoted by the Pipeline and Hazardous Materials Safety Administration (PHMSA), to support risk-informed decision-making. The early part of this paper describes the regulator station risk model itself. The latter part documents the methodology used to provide critical information required by the risk model for determining consequence of failure. Finally, a parametric study approach is added to analyze the hydraulic consequences of regulator stations completely failing open.
This paper discusses how the application generates results, including the number of customers affected, the pressure seen downstream, station scoring, and the total length of pipe overpressured. This analysis is performed wholly on the Microsoft Azure public cloud, using a version of a commonly used hydraulic engine from a desktop product. This architecture enables an entire system to be analyzed in minutes, compared to the hours or days an engineer would need to complete the work manually. Time is always a barrier to performing lengthy analyses, and this application helps to erode that barrier by enabling utilities to generate the results they need to perform risk analyses more frequently. This is important because these natural gas systems are constantly changing as components deteriorate and are repaired. Having an up-to-date risk model helps to keep the public safe and helps the utility make the best decisions for managing and operating its system. DNV works directly with clients to implement necessary measures to define threats and comply with federal and local regulations. The paper’s authors intend to partner with select utilities to develop comprehensive station risk management plans, which will help refine the methodologies described in the paper. This may not take place until after this paper is published, so any advancements derived from these partnerships may have to wait for publication at the 2021 conference.
Key Takeaways
- Beliefs in the myths of zero energy and zero risk, coupled with myths of guarding, constrain efforts for prevention through design (PTD).
- Understanding the history of OSHA and the agency’s encouragement of the use of voluntary standards allows safety professionals to make a realistic assessment of the current state.
- With safety professionals leading the way on risk assessment and feasible risk reduction based on the hazard control hierarchy, PTD can move forward with potentially significant strides in the decade of the 2020s.

Significant advances in prevention through design (PTD) have been made in recent years. ANSI B11.0-2010 introduced a model for life cycle risk assessment in general industry machinery and equipment. That voluntary standard has continued to build on the concepts of PTD with its updates in 2015 and 2020. Other ANSI standards and articles continue to keep opportunities and challenges at the forefront in the global community of safety professionals. This article addresses the narrow slice of PTD for machine guarding and control of hazardous energy to explore several issues: (1) how we got to today’s current state; (2) how beliefs of safety professionals and engineers sometimes conflict with the strict language of OSHA; and (3) how voluntary ANSI standards can guide us in pursuits of PTD. With that understanding, this article builds on the work of Main (2020) in “New Opportunities in Safety: Lessons From a Risk Assessment Journey.” It also shows how ANSI B11.0 can be used to comply with OSHA’s General Duty Clause and assist in compliance with the complexities of process safety management.
The International Organization for Standardization (ISO) has issued a new ISO 15663 containing requirements and guidance on how to apply life cycle costing methodologies to provide decision support for selection between alternative competing options. The document specifies requirements and gives guidance on the application of costing to create value for the development activities and operations associated with drilling, exploitation, processing, and transport of petroleum, petrochemical, and natural gas resources. Life cycle costing is an iterative process of planning, estimating, and monitoring costs and revenue differences throughout an asset's life and is used to support the decision-making process by evaluating alternative options and performing trade-off studies. While the largest benefits are typically achieved in the early life cycle phases, it is equally applicable to all phases and at many levels of detail. Cost management within the petroleum, petrochemical, and natural gas industries will benefit from the adoption of a common and consistent approach to life cycle costing, according to the organization.
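A minimal sketch of the kind of trade-off study the standard describes: discount each option's capital and operating cash flows to a present value and compare. All cash flows, the horizon, and the discount rate below are hypothetical, chosen only to show the mechanics.

```python
# Life-cycle cost comparison of two hypothetical design options:
# Option A has lower CAPEX but higher annual OPEX; Option B the reverse.

def life_cycle_cost(capex, annual_opex, years, rate):
    """Present value of year-0 CAPEX plus discounted annual OPEX."""
    return capex + sum(annual_opex / (1 + rate) ** t for t in range(1, years + 1))

option_a = life_cycle_cost(capex=10e6, annual_opex=1.2e6, years=20, rate=0.08)
option_b = life_cycle_cost(capex=14e6, annual_opex=0.7e6, years=20, rate=0.08)
print(f"Option A LCC: {option_a / 1e6:.1f} MM; Option B LCC: {option_b / 1e6:.1f} MM")
```

On these assumed figures the higher-CAPEX option wins over the asset's life, illustrating why life-cycle costing can reverse a decision that a CAPEX-only comparison would make.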
Alkinani, Husam Hasan (Missouri University of Science and Technology) | Al-Hameedi, Abo Taleb Tuama (Missouri University of Science and Technology) | Dunn-Norman, Shari (Missouri University of Science and Technology)
Abstract Lost circulation and related drilling problems present a major challenge for the drilling industry; billions of dollars are spent each year to treat them. There is no single solution to lost circulation because of the complexity and variety of formations susceptible to this issue. Lost circulation treatment data for the Shuaiba formation (an induced-fracture formation) were gathered from over 2,000 wells drilled in Southern Iraq. Treatments were grouped according to the volume of mud loss into complete, severe, and partial loss remedies. Detailed cost and probability calculations were conducted. The costs of the three types of loss treatments (partial, severe, and complete) were handled separately, since some treatments for severe losses, and all treatments for complete losses, have to be introduced through open-ended drill pipe (OEDP). Expected monetary value (EMV) and decision tree analysis (DTA) were utilized to choose the optimal pathway for treating each type of mud loss. In this study, both probability and cost were considered to select a practical and efficient strategy for stopping mud loss. Many remedy scenarios were investigated. The selection of the optimum strategy for each type of loss was based on the lowest EMV and on treatment efficiency; once both conditions were satisfied, the treatment strategies were selected for each type of loss. Treatment strategies for complete, severe, and partial losses are provided as flowcharts that can be used as a field reference to stop, or at least mitigate, this troublesome problem. The methods used in this paper can be adopted to treat mud loss based on historical treatment data in any formation worldwide.