Yesterday’s practices are being superseded by a universal trend towards the extensive use of historical and real-time data to understand, learn from and predict all well intervention operations. This course explores the impact of data analytics on well operations. Drawing on the presenter’s extensive experience in data analysis, it examines, in easily understandable terms, today’s data management processes and how they target process improvement.
A Digital Twin is a software representation of a facility that can be used to understand, predict and optimize performance, helping to achieve top performance and recover future operational losses. The Digital Twin consists of three components: a process model, a set of control algorithms, and knowledge.
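The interplay of the three components can be illustrated with a minimal sketch: a first-order process model, a PI control algorithm, and a small store of design "knowledge" that parameterises both. All names and numbers below are invented for illustration and do not represent the Culzean model.

```python
# Minimal, hypothetical illustration of the three Digital Twin components:
# a process model, a control algorithm, and design knowledge.

def simulate(setpoint, steps=200, dt=1.0):
    """Run a PI controller against a first-order process model."""
    # Knowledge: design data that parameterise the model and controller.
    knowledge = {"time_constant": 10.0, "gain": 2.0, "kp": 0.8, "ki": 0.1}

    level, integral = 0.0, 0.0
    for _ in range(steps):
        # Control algorithm: PI law acting on the tracking error.
        error = setpoint - level
        integral += error * dt
        u = knowledge["kp"] * error + knowledge["ki"] * integral

        # Process model: first-order lag response to the control input.
        level += dt / knowledge["time_constant"] * (knowledge["gain"] * u - level)
    return level

print(round(simulate(5.0), 2))  # → 5.0 (controller settles on the setpoint)
```

A real twin replaces the toy model with a high-fidelity process simulation and the toy PI law with the actual ICSS control configuration, but the loop structure is the same.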
The time to commission a project usually exceeds the initial estimate, so delays in project completion are common. This is often because ICSS testing is done on a static system, which does not account for how the system will react dynamically to scenarios such as start-ups and shutdowns. Issues such as configuration errors, loop behaviours, start-up overrides, dead-lock inter-trips and sequence logic are difficult to predict and often impossible to anticipate during static testing. Such delays lead to higher costs and therefore reduced revenue.
This paper describes an innovative approach to Project & Operational Certainty that addresses these issues by using a Digital Twin for commissioning support and training. One successful application of this approach was on the Culzean project, an ultra-high-pressure, high-temperature (UHP/HT) gas condensate development in the UK sector of the Central North Sea. A high-fidelity process model was built and fitted to the actual plant performance based on equipment data sheets. This was connected to the ICSS database and graphics, offering a realistic environment, very close to the one offshore, with the same look and feel for the operators.
Dynamic tests conducted on the Digital Twin predicted issues on the real system, which enabled potential solutions to be tested and led to a significant decrease in commissioning time and cost. All the operating procedures were dynamically tested, allowing errors to be corrected and saving time before First Gas. Additionally, all Control Room Operators (CROs) and field technicians were trained on and familiarised with the system months in advance, with the aim of avoiding unnecessary trips during First Gas.
Finally, all the control loops were fine-tuned in the Digital Twin and the parameters were passed offshore to be used as a first starting point. These parameters are expected to be very close to the final operational settings, as the model is a high-fidelity representation of the real offshore system.
In 2016 BP adopted a technology plan to investigate how efficiencies could be realized in the inspection area. The project, termed UWIP (Under Water Inspection Program), was divided into two areas: alternative inspection technology and advanced inspection technology.
Alternative Inspection technology addresses the configuration of existing technology to deliver efficiencies.
Advanced Inspection technology looks to near-future opportunities that may be realized within a five-year period.
This presentation primarily addresses the Alternative agenda, focusing on how configuring sensor packages onboard a variety of underwater vehicles has delivered data up to 8 times faster than traditional inspection methodologies. Termed FDII (Fast Digital Imaging Inspection), the concept aims to replace video with laser/stills imaging and contact Cathodic Potential systems with Field Gradient measurement.
The Advanced agenda presents BP's progress in delivering automated Unmanned Surface Vessel and Underwater Vehicle systems into inspection programs.
BP has undertaken three FDII campaigns, in the North Sea in 2017/18 and in Trinidad in 2018, inspecting 825 pipeline kilometers. Another two FDII programs are scheduled in the North Sea and Caspian regions in 2019. Data acquisition has significantly increased; however, data management techniques have had to be reviewed and adapted. Inspection and integrity contractors expect to receive data in traditional formats, and their systems (as well as operators') are not configured to receive and interpret the new FDII data. Additionally, software houses are behind the curve in enabling users to host FDII data and deliver it to stakeholders.
FDII facilitates rapid data acquisition, and operational teams are quick to claim credit for efficient execution. But bottlenecks in editing, eventing and delivering data to stakeholders have removed some of the ‘shine’ from the project. For FDII to develop, a step change in data management is required.
FDII is a technique, not an inspection criterion. It lends itself to fast-ROV and AUV underwater vehicle developments, which are in turn linked to operation from Unmanned Surface Vessels. BP has a stated goal that by 2025 all inspections will be performed from unmanned systems; FDII is a technology that progresses us towards that goal.
The realization that fossil fuels are a limited resource, and the growing awareness of the negative impact their emissions have on the planet, have affected every oil and gas major. The global challenge is expressed in the "energy trilemma" of: Enough Energy, Affordable Energy and Sustainable Energy.
The industry must adapt, in terms of cost and environmental footprint. In this paper we discuss how digitalization and renewable sources can drive innovation to meet these challenges.
We will use current long-range forecasts to understand how the global energy mix is expected to change over time and illustrate how different scenarios are likely to affect the offshore industry. We also study how digitalization, and hybridization with technologies such as offshore wind and power-from-shore, can reduce costs, energy consumption and emissions.
There are many trends accelerating the introduction of new energy sources. These include:
Global population growth and changing dynamics: "Millennials" bring with them their own expectations about technology, the pace of work and accountability. Equally influential is the challenge to feed and power the 2 billion poorest and the extra 2 billion people expected by 2050.
Transportation changes: Road, aviation and shipping account for more than 60 percent of the world's oil consumption and are key to limiting the impact on the climate.
Energy generation revolution: The grid needs to cope with increased power demands and to incorporate and expand the contribution of renewables.
Rise in distributed generation: Hybridization pilot projects use offshore wind turbines to power, for example, water injection systems.
A range of technologies is described that will provide the transformational step change enabling companies to transition into the broader energy ecosystem. However, the real game changer lies in integrating these technologies in a way that drives the evolution from connected operations to collaborative operations and ultimately autonomous operations, to achieve maximum value.
We will describe how, by properly using digital technologies, the sector can not only reduce capital and operating expenditures by up to 30 percent but also use energy optimization and hybridization with renewable energy sources to reduce emissions and help oil and gas operators do their part in addressing "The Energy Trilemma".
No matter the industry or activity, when a human is performing a task there is a possibility that the person carrying out that task could make an error. Numerous studies show that the contribution of poor procedures to human errors that lead to incidents ranges between 65 and 90%. Process safety management legislation such as Seveso III and OSHA 1910.119 requires the use of procedures when executing safety-critical tasks, and regulators have thus recognized the importance of a set of good-quality procedures as part of the management of human factors.
As companies begin to embrace the concepts of digitalization and big data, the main challenge remains: ‘how do we make a step change in reducing human error in heavily paper-based operating and maintenance procedures?’
This paper will provide examples of how poor procedures have led to human error causing incidents across industry, introduce the background to human factors with respect to procedures, and explain some of the human error categories to which people are susceptible. The paper will then explain the road-map approach that the UK regulator (the UK Health and Safety Executive) has adopted as part of its Human Factors Delivery Guide. It then shows how the energy industry's approach of returning these procedures to a paper format fails to take advantage of available digital technologies to make the step change in reducing human error.
This paper shows that incidents continue to occur in all industries due to human error in procedures, and that the drive from the regulator to perform Critical Task Analysis can actually make procedures less usable (if these reviews are not performed correctly). The paper then shows how taking a digital approach to meeting these new regulatory requirements provides the opportunity to digitize existing operating and maintenance procedures, enabling a structured, efficient and auditable approach to these assessments. It also shows how adopting the available digital technologies provides new performance-influencing techniques that are not available in paper-based systems.
The paper will also show how emerging technologies such as Augmented Reality can further enable the transition to these new technologies, and how big data can provide continuous improvement of procedures, ensure appropriate competencies are in place for field workers performing tasks, and introduce significant efficiencies to lower operating costs.
Human error continues to contribute significantly to incidents in the energy and other industries. To address this, regulators such as the UK Health and Safety Executive are placing new requirements on operating companies to ensure the risks associated with errors in procedures are managed more effectively. The opportunity to make a step change in reducing human error, whilst also providing an efficient workflow, will lead to safer working environments, reduced potential impacts on the environment and efficiencies for operation and maintenance teams, resulting in savings in operational expenditure.
With the most recent industry downturn still fresh in many minds, the oil and gas E&P sector is approaching this recovery with a commitment to long-term cost discipline. As a result, augmented reality (AR) and virtual reality (VR) technologies are being adopted by operators and service companies alike as a means of cost savings while driving operational efficiency.
AR technologies employ enhanced visualization hardware, techniques, and methodologies to create new environments wherein digital and physical objects and their data coexist and interact with one another, enhancing the user's experience of the real world.
Until recently, these technologies were primarily applied as enhanced entertainment products, most notably within the gaming industry. However, during the past several years, and thanks to the introduction of hands-free, head-mounted display (HMD) technologies, such as Microsoft® HoloLens™ and now HoloLens 2, AR and VR are migrating into the enterprise sector.
While the oil field has not been as quick to integrate AR and VR as other sectors, such as medicine, defense, and aeronautics, operators and service providers alike have increased adoption over the past 12 months. Motivated by a mandate to keep operating costs low and improve the efficiency of field processes, operators have begun implementing AR/VR applications as collaborative problem-solving, planning, and design tools.
For example, some operators are initiating AR concepts to promote internal development and prototyping for both oilfield applications and remote refinery inspections. Additionally, service companies are embracing smart glasses and wearable technologies to improve remote work and collaboration, increase in-field safety, and reduce downtime.
As part of its strategy to help drive the oil and gas industry's digital transformation, one major service provider is developing AR/VR applications to create digital representations of physical oilfield assets on the Microsoft® HoloLens device. One area of focus is the planning, design, and deployment of solids control, fluid separation, and handling technologies for offshore drilling applications.
Innovation is critical to the future success of the oil and gas industry.
To address this, the TechX programme at the Oil & Gas Technology Centre launched TechX Ventures in July 2018 – a partnership with Deep Science Ventures (DSV) – which combines deep science with engineering to create the next generation of start-up companies with technologies that will position the oil and gas industry for a sustainable future in a low-carbon economy.
The programme started with a workshop held with industry, academia and the scientific community to identify areas where new thinking and technology could open up significant opportunities. Three challenge themes were developed, each of which became an opportunity area for DSV to address. These are:
As part of the TechX Ventures programme, DSV recruited thirty scientists and engineering experts from across the world to tackle the opportunity areas and at the end of the nine-month programme a total of six new start-up companies with new intellectual property were created and invested in by DSV. Of these six, two were selected to join the coveted TechX Pioneer accelerator programme run by OGTC in Aberdeen. These companies are called Eltera and Optic Earth.
Birnie, Claire Emma (Equinor ASA) | Sampson, Jennifer (Equinor ASA) | Sjaastad, Eivind (Equinor ASA) | Johansen, Bjarte (Equinor ASA) | Obrestad, Lars Egil (Equinor ASA) | Larsen, Ronny (Equinor ASA) | Khamassi, Ahmed (Equinor UK LTD)
To ensure safe and efficient operations, all offshore operations follow a plan devised to take into account current operating conditions and identify the optimum workflow with the minimum risk potential. Previously, planners had to manually consult eight data sources, each with a separate UI, and summarise the plan in a .pdf document. Equinor's Operation Planning Tool (OPT) has been developed to easily present the planners with the technical conditions of a platform, identify potentially dangerous combinations of concurrent activities, and propose learnings from eight years’ worth of incident recordings – all relevant to the current list of planned activities. The tool aims to answer questions such as ‘are other activities planned for the same time which would make this activity unsafe?’ or ‘have incidents previously occurred whilst performing similar tasks on this equipment type?’.
This paper details the development of the OPT, with a particular focus on the application of Natural Language Understanding for extracting the equipment types and tasks involved in previous incidents and relating these to planned activities. Utilising natural language processing techniques, a system has been developed that mines the content of Equinor's incident database and assigns context to incidents by identifying the systems, activities and equipment involved and the conditions on the asset at the time of the incident. The same context is also discovered from the content of planned activities. These key concepts are organised into a knowledge graph synthesising Equinor's institutional safety and operational experience.
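The pipeline described above can be sketched in miniature: pull equipment and activity mentions out of free-text reports, link each concept to the incidents citing it, and query that graph with a planned activity. The vocabulary and incident texts below are invented; a production system like the OPT would use a trained NLU model rather than keyword matching.

```python
# Hedged sketch: context extraction from incident text plus a
# simple concept-to-incident knowledge graph. All data is hypothetical.
from collections import defaultdict

EQUIPMENT = {"pump", "valve", "compressor", "crane"}
ACTIVITIES = {"lifting", "welding", "isolation", "pressure testing"}

def extract_context(text):
    """Return the equipment and activity terms mentioned in a text."""
    lowered = text.lower()
    return ({e for e in EQUIPMENT if e in lowered},
            {a for a in ACTIVITIES if a in lowered})

def build_graph(incidents):
    """Map each equipment/activity concept to the incidents citing it."""
    graph = defaultdict(set)
    for ident, text in incidents.items():
        equipment, activities = extract_context(text)
        for concept in equipment | activities:
            graph[concept].add(ident)
    return graph

def related_incidents(graph, planned_activity):
    """Incidents sharing any concept with the planned activity text."""
    equipment, activities = extract_context(planned_activity)
    hits = set()
    for concept in equipment | activities:
        hits |= graph.get(concept, set())
    return hits

incidents = {
    "INC-1": "Hydraulic leak during isolation of the main export pump.",
    "INC-2": "Dropped object while lifting with the pedestal crane.",
}
graph = build_graph(incidents)
print(sorted(related_incidents(graph, "Isolation of booster pump")))
# → ['INC-1']
```

The query answers exactly the kind of question posed above: which past incidents involved the same equipment type or task as the activity now being planned.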
The OPT has reduced time spent planning by providing a single interface detailing a plant's technical conditions, all planned work orders and relevant lessons learned from previous incidents. By reducing the reliance on personal experience, the tool has provided subjectively improved risk identification and handling, plus faster knowledge transfer to new employees as well as focussed cross-platform knowledge sharing. The success of the tool highlights the strength of combining data and leveraging the vast quantities of historic data available both in unstructured and structured forms to create a safe, offshore work environment.
This paper advocates utilising the diagnostic data available from digital field devices to help reduce operating costs for end users.
In recent years, companies across multiple industrial sectors have invested in improving their understanding of both the historical and live data they produce. The source of the data is process-specific, but the objective for all remains the same: to use statistical techniques to develop a toolset that can predict performance based on live and historical data.
For the oil and gas industry, the continued adoption of digital device transmitters has increased the volume of data available from instruments such as flow meters, temperature probes and pressure sensors. Typically, this additional data provides information on the integrity or quality of the associated device. However, with the appropriate level of facility and instrument knowledge it is also possible to infer information with respect to the process stream.
Furthermore, this data, if correctly interpreted, can be used to predict maintenance and calibration requirements, resulting in reduced staff effort and fewer shutdowns. The need for physical intervention due to device failure is also reduced, which in turn minimises the potential for an accidental hydrocarbon release when a device is removed for repair or replacement.
NEL are currently undertaking research projects with the primary objective of developing definitive correlations between process effects, meter condition and diagnostic data response. The paper provides details of this research, with particular reference to the data science and mathematical techniques currently being trialled for the analysis stage. The techniques, when fully developed, will be specific to each metering technology and will therefore offer end users a level of insight into facility and meter performance which is not currently available in industry. The toolsets developed will in turn provide end users with the knowledge and confidence to make cost-saving decisions with respect to planned maintenance, as well as improving facility efficiency through a more comprehensive understanding of their own data sets.
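One basic analysis of the kind alluded to above can be sketched simply: compare live diagnostic readings against a baseline recorded under known-good conditions and flag statistically significant drift as a maintenance trigger. The data, threshold and method below are illustrative assumptions, not NEL's actual techniques.

```python
# Hedged sketch: z-score drift detection on device diagnostic data.
# Baseline, live values and the threshold are all hypothetical.
from statistics import mean, stdev

def drift_alarm(baseline, live, z_threshold=3.0):
    """Flag live readings that deviate from the baseline distribution."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [abs(x - mu) / sigma > z_threshold for x in live]

# Diagnostic values recorded during a known-good calibration.
baseline = [1.00, 1.02, 0.98, 1.01, 0.99, 1.03, 0.97, 1.00]
# Live values: the last reading has drifted well outside the baseline.
live = [1.01, 0.99, 1.15]
print(drift_alarm(baseline, live))  # → [False, False, True]
```

An alarm on the third reading would prompt a calibration check before the device's output is trusted, rather than waiting for a scheduled intervention.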
Several mature fields in the North Sea experience significant challenges relating to high pressures and temperatures, accompanied by the infill drilling challenge of very narrow margins between pore and fracture pressures. To navigate these narrow mud weight windows, it is critical to understand the bottom-hole pressure. However, where fractured formations lie above the target zones, severe losses can be encountered during drilling and cementing operations, often leading to the inability to maintain a full mud column at all times and even threatening the ability to reach TD.
The operator therefore decided to investigate the use of a new acoustic telemetry system that could provide internal and external pressure measurements, (along with other downhole measurements) independently of traditional mud pulse telemetry in the drilling assembly. Real-time distributed pressure data essential to understanding the downhole conditions could therefore be provided regardless of circulation, even under severe losses or during tripping and cementing operations.
This acoustic telemetry network was deployed on several wells through multiple hole sizes and including losses management, liner running and cementing operations.
The initial primary purpose of running the network was the ability to monitor the top of the mud at all times, even in significant loss situations. As real-time data was acquired, it became apparent that the data could also be used in real time to help quantify the actual downhole pressures. The use of this downhole data was expanded, and new calculations were designed for simpler visualization of equivalent circulating densities at the shoe, the bit and identified weak zones in the well, at depths beyond the acoustic tools themselves. This data was used to manage the bottom-hole pressure within a 300 psi mud weight window, ultimately enabling the well to be delivered to planned TD.
The tool and calculations helped verify managed pressure connections and subsequent pump ramp-up and ramp-down operations to minimize pressure fluctuations in the well. Additionally, the data was used during dynamic formation integrity testing and to measure and calculate ECD at various positions along the drillstring and casing when downhole PWD measurements were unavailable.
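The core ECD arithmetic behind this kind of monitoring is standard oilfield practice and can be sketched as follows. This is a generic illustration with invented depths and pressures, not the operator's actual calculations: when a downhole annular pressure measurement is available, ECD in ppg is the measured pressure divided by 0.052 times the true vertical depth in feet.

```python
# Hedged sketch of the standard ECD relation from a measured annular
# pressure; all numerical values below are hypothetical examples.

def ecd_ppg(annular_pressure_psi, tvd_ft):
    """Equivalent circulating density (ppg) from measured annular pressure."""
    return annular_pressure_psi / (0.052 * tvd_ft)

def window_margin_psi(ecd, fracture_gradient_ppg, tvd_ft):
    """Remaining pressure margin to the fracture gradient at that depth."""
    return (fracture_gradient_ppg - ecd) * 0.052 * tvd_ft

tvd = 15000.0      # true vertical depth, ft (illustrative)
p_ann = 12090.0    # measured annular pressure, psi (illustrative)
ecd = ecd_ppg(p_ann, tvd)
print(round(ecd, 2))                                   # → 15.5
print(round(window_margin_psi(ecd, 15.88, tvd), 0))    # → 296.0
```

Tracking this margin continuously, at the shoe, the bit and any identified weak zones, is what allows bottom-hole pressure to be held inside a window as narrow as the 300 psi cited above.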
This paper will describe how the new downhole acoustic network technology was deployed and the lessons learned in how the real-time data was used, changed and adapted on this particular well. Following this deployment, the acoustic telemetry network will now be used on upcoming, equally challenging wells, with its range of operations expanded to include drilling, tripping and liner cementing operations.