During the last few years, the petroleum industry has been experiencing significant changes in various areas, including workforce, exploration targets, application of new technologies, and general operational areas of focus. A prolonged depression of oil prices, shifts in the geopolitical atmosphere, rising investment in unconventional resources, and the adoption of emerging technologies (including digital) have been the primary catalysts of change within the industry. In terms of workforce, these changes have produced leaner organizations, along with the unintended consequences of losing critical expertise and creating knowledge gaps at many organizations. The changes, particularly in technology, call for a fresh look at the skills that current and future petroleum engineers must acquire to match new areas of interest, such as data analytics and artificial intelligence.
As the oil industry continues to evolve, it is imperative for academic organizations to consider these changing dynamics and respond to them. This paper outlines the results of a recent survey that targeted industry managers and supervisors who have direct experience with newly minted petroleum engineering graduates (less than five years of experience). The survey asked participants for their opinions regarding the preparedness of recent graduates as they enter the workforce. Its intent was to identify the potential need to modify the skills and knowledge currently acquired in academic institutions during undergraduate study.
A comprehensive survey posing questions about both the classical petroleum engineering curriculum and contemplated additions to it was sent to recipients, primarily within the reserves and reservoir-engineering sector. The recipients were industry professionals working in the operating, service, financial, and consulting sectors of the petroleum industry. More than 200 responses were received. The tabulated results are presented in the paper, along with an interpretation of the results. The raw data will be made available through OnePetro as an accompaniment to the published paper.
The paper presents the survey conclusions, proposed action items, and discusses plans for a follow-up survey.
As data computing and big-data-driven analytics become more prevalent across industries that work with spatial information, there is an increasing need to quantify and communicate the uncertainty associated with those data and with the resulting spatial analytical products. This has direct implications for oil & gas exploration and development, where big data and data analytics continue to expand the uses and applications of spatial and spatio-temporal data without providing for effective communication of spatial uncertainty. As a result, communications and inferences made using spatial data visuals lack crucial information about uncertainty and thus present a barrier to accurate and efficient decision making. With increasing cost awareness in oil & gas exploration and development, there is an urgent need for methods and tools that help to objectively define and integrate uncertainty into business decisions.
To address this need, the Variable Grid Method (VGM) has been developed for simultaneous communication of both spatial patterns and trends and the uncertainty associated with data or their analyses. The VGM utilizes varying grid cell sizes to visually communicate and constrain the uncertainty, creating an integrated layer that can be used to visualize uncertainty associated with spatial, spatio-temporal data or data-driven products.
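The varying-cell-size idea behind the VGM can be sketched with a simple quadtree-style subdivision: cells are refined where data support is strong (low uncertainty) and left coarse where data are sparse (high uncertainty). This is a hypothetical illustration under assumed rules, not the authors' implementation; the function name, stopping criteria, and data layout are all invented here for demonstration.

```python
def subdivide(points, x0, y0, size, min_pts=8, min_size=0.125):
    """Quadtree-style VGM sketch: returns (x0, y0, cell_size, mean_value)
    tuples. Cells shrink where many data points support the estimate (low
    uncertainty) and stay coarse where data are sparse (high uncertainty)."""
    inside = [(x, y, v) for x, y, v in points
              if x0 <= x < x0 + size and y0 <= y < y0 + size]
    # Stop refining when data support is weak or the cell is already small.
    if len(inside) < min_pts or size <= min_size:
        vals = [v for _, _, v in inside]
        mean = sum(vals) / len(vals) if vals else None
        return [(x0, y0, size, mean)]
    half = size / 2.0
    cells = []
    for dx in (0.0, half):
        for dy in (0.0, half):
            cells.extend(subdivide(inside, x0 + dx, y0 + dy, half,
                                   min_pts, min_size))
    return cells
```

A reader of the resulting map can then interpret cell size directly as an uncertainty indicator: large cells flag areas where additional data acquisition would most improve subsurface definition.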
In this paper, we detail the VGM approach and demonstrate the utility of the VGM to intuitively quantify and provide cost-effective information about the relationship between uncertainty and spatial data. This allows trends of interest to be objectively investigated and target uncertainty criteria defined to drive optimal investment in improved subsurface definition. Examples are presented to show how the VGM can thus be used for efficient decision making in multiple applications including geological risk evaluation, as well as to optimize data acquisition in exploration and development.
Today, uncertainty, if it is provided at all, is generally communicated using multiple independent visuals, aggregated into final displays, or omitted altogether. The VGM provides a robust method for quantifying and representing uncertainty in spatial data analyses, offering key information not only about the analysis itself but also about the associated risks, both of which are vital for making prudent business decisions in oil & gas exploration and development.
More than 400 students from 20 SPE student chapters across the US, Mexico, and Canada gathered at the annual SPE North America Student Symposium (NASS) held at Louisiana State University in Baton Rouge during 21–24 February. This year's symposium focused on data analytics and called upon a range of industry professionals to come together to discuss the challenges and opportunities of today's digital age. More than 25 SPE student members from 15 universities organized various events that homed in on the digital aspect of the oil and gas industry. Co-directors for the event included Abdul Homran (LSU), Adam Larson (Penn State University), and Bryan McDowell (Colorado School of Mines). Discussions at the symposium addressed the new wave of technology that is emerging in oil and gas: Big Data.
The San Joaquin Valley Section is proud to announce that Andrei Popa is the 2006 recipient of the SPE Outstanding Young Member Award for the Western North America Region. The Regional SPE Outstanding Young Member Award recognizes contributions to and leadership in public and community matters, the Society, the petroleum engineering profession, and/or the petroleum industry by SPE members under the age of 36. Popa was honored during the joint SPE/AAPG/GSA meeting in Anchorage at an awards luncheon held on 9 May. Popa is currently the Program Chairperson for the San Joaquin Valley SPE Section and previously has served as Secretary and Activities Director. He has specialized expertise in artificial intelligence and data-mining applications to the petroleum industry and has authored or coauthored 13 SPE papers on the subject.
The fourth industrial revolution, or Industry 4.0, has the potential to disrupt every industry, including the oil and gas industry through large-scale automation, robotics, artificial intelligence, and big data analytics. Young professionals (YPs) will be the main engine responsible for the development of many Industry 4.0 technologies in the oil and gas industry. This survey by the SPE Saudi Arabia section intends to gauge the pulse of YPs and their readiness to engage in it. The results of the survey will be discussed with executives from major companies during the executive panel titled "Energy Meets Intelligence" at the SPE-KSA Annual Technical Symposium and Exhibition to be held in April in Dammam, Saudi Arabia.
The Simplified Series, one of the most successful programs of the SPE Aberdeen Section young professionals (YPs), kicked off in September after a summer break with a presentation on Big Data and how to make sense of information and analytics in the oil and gas industry. Instead of focusing on revenue enhancement, Steven Rossiter, managing director at AgileTek, offered useful insights into how big data can be used in the industry to improve safety and quality control and to strengthen forecasting, moving toward evidence-supported decision making instead of relying on the intuition of individuals. The event was organized and hosted by the SPE Aberdeen Section YP Committee and enjoyed a large turnout of 77 professionals. It took place during the SPE ENGenious Symposium, SPE's new global symposium and exhibition aimed at driving radical digital and technological change across the upstream oil and gas industry. The YPs had the opportunity to meet 2018 SPE President Darcy Spady and SPE South, Central and East Europe Director Jean-Marc Dumas.
Yesterday’s practices are being superseded by a universal trend towards the extensive use of historical and real-time data to understand, learn and predict all well intervention operations. This course explores the impact of data analytics on well operations. Drawn from the presenter’s extensive experience in data analysis, it examines, in easily understandable terms, today’s data management processes targeting process improvement.
Workplace safety is a main objective of any company working in the oil and gas business. Safety processes have been developed and established over the past decades based on individual experiences and causal pathways. As technical and administrative barriers have been exhausted, behavioral safety has been introduced. Recent advances in data technology and machine learning have disrupted many businesses and processes and can lead to a new paradigm in workplace safety as well.
In this case study we demonstrate the application of data science and predictive analytics to aid the HSE function and prevent accidents. We have analyzed operational and accident data from the past 10 years at a leading oil and gas company to quantify the effectiveness of their safety programs.
We have determined how many accidents each program actually prevents and how many it could prevent in an optimal setting. We have also determined the optimal level of engagement for each program and the level at which diminishing returns set in.
We have further developed a predictive model to forecast the occurrence of accidents one month ahead of time. In this way the HSE function is able to focus on 15% of locations to control 69% of the accidents. The forecast was also able to predict accidents at locations where one would traditionally not expect accidents to happen, such as locations with low activity.
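The "focus on 15% of locations to control 69% of the accidents" result is, at its core, a ranking-and-coverage calculation. The toy sketch below illustrates the mechanics with hypothetical data and an invented function name; it does not reproduce the paper's predictive model, only the way a risk ranking translates into accident coverage.

```python
def coverage_from_top_locations(risk, accidents, frac=0.15):
    """Rank locations by predicted risk score and report what share of
    next month's accidents falls within the top `frac` of locations.

    risk:      {location: predicted risk score}
    accidents: {location: observed accident count}
    Returns (set of focus locations, fraction of accidents covered)."""
    ranked = sorted(risk, key=risk.get, reverse=True)
    k = max(1, int(len(ranked) * frac))        # size of the focus set
    focus = set(ranked[:k])
    covered = sum(accidents.get(loc, 0) for loc in focus)
    total = sum(accidents.values())
    return focus, (covered / total if total else 0.0)
```

In practice the risk scores would come from the forecasting model; the point of the calculation is that the HSE function can concentrate its limited preventive resources on the small focus set while still addressing most of the expected accidents.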
This paper shows the potential for improvement that is possible with the emerging big data, artificial intelligence and machine learning tools specifically in the field of workplace safety.
This article describes a practical approach to applying predictive analytics techniques against safety incident and near-miss data to generate actionable insights that change safety outcomes in the field. Examples illustrate three critical ways to use safety data: 1) predicting where incidents are most likely to occur, informing where to place additional resources and effort; 2) understanding the combinations of causes and sub-causes that are creating incidents, improving the focus of safety programs; and 3) revealing which proactive safety activities will best mitigate incident types predicted to occur, increasing the effectiveness of preventive measures. The authors discuss typical data and implementation challenges and encourage companies to stop waiting for "perfect" data and, instead, start applying predictive analytics to deliver targeted safety insights to supervisors and workers in the field. Are you ready to take the first step? According to the latest statistics published by Great Britain's Health and Safety Executive, the fatality rate has remained broadly flat across industries since 2012, claiming the lives of 144 workers in the UK during the 2017/2018 reporting period alone. When combined with 550,000 nonfatal injuries during the same time, it seems clear that current approaches to preventing occupational injuries are not working.
Life-cycle safety and integrity management of offshore structures is a critical activity owing to the adverse consequences of structural failure, ranging from loss of life and financial consequences to environmental pollution. Historically, integrity management of substructures such as jacket structures has been the subject of more detailed investigation than integrity management of topside structures. For instance, more specific risk- and reliability-based methodologies exist for integrity assurance and inspection planning of jacket structures than for topside structures. This article presents a practical methodology for risk-based inspection planning of large-scale topside structural systems under different limit states (ultimate, accidental, fatigue, and serviceability) and degradation mechanisms (e.g., corrosion and fatigue crack growth), with a view to data analytics and digitalization. The main advantage of the presented methodology is its capability to systematically rank the different structural elements/areas relative to one another based on their assessed level of risk of failure, i.e., a risk-based differentiation, and to plan inspections and repairs accordingly for large-scale structural systems. Such an integrated approach will result in efficient and economical management of offshore topside structural assets and can serve as a consistent and coherent basis for lifetime extension or decommissioning of offshore platforms. Integrity management of offshore structures is known to involve the analysis and management of large amounts of data and information over the lifetime of the structure.
Therefore, insights are provided in the article regarding how the presented risk-based methodologies can be integrated into a digitalized and data-driven interface, a topic currently under active investigation across the petroleum industry, facilitating the analysis and management of the involved data and information in an efficient and verifiable manner.
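The risk-based differentiation described above reduces, in its simplest form, to scoring each structural element by probability of failure times consequence and ranking the elements so that inspection effort flows to the highest-risk items first. The sketch below is a minimal illustration with hypothetical element names and values, not the article's methodology.

```python
def rank_elements(elements):
    """Risk-based differentiation sketch: score each structural element
    by probability-of-failure x consequence-of-failure and rank the
    elements in descending order of risk.

    elements: list of (name, prob_of_failure, consequence) tuples.
    Returns a list of (name, risk_score) tuples, highest risk first."""
    scored = [(name, pof * cons) for name, pof, cons in elements]
    return sorted(scored, key=lambda item: item[1], reverse=True)
```

Inspection and repair intervals can then be assigned per risk band, so that a low-probability but high-consequence element can outrank a frequently failing but benign one.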