The complete paper discusses a study in which the authors propose a joint field-development and well-control-optimization workflow using high-performance parallel simulation and commercial cloud computing. In upstream oil and gas, cloud adoption remains immature because the industry has long been challenged by its storage and computational demands. However, high-performance cloud computing may create an opportunity for smaller companies that lack the infrastructure for scientific applications.
Researchers at the University of Massachusetts, Amherst, performed a life-cycle assessment for training several common large AI models. They found that the process can emit more than 626,000 lbm of carbon dioxide equivalent—nearly five times the lifetime emissions of the average American car. Southwest Research Institute is working to improve the accuracy of pipeline leak detection using sensors, artificial intelligence, and deep learning.
This course discusses the fundamental sand control considerations involved in completing a well and introduces the various sand control techniques commonly used across the industry, including standalone screens, gravel packs, high-rate water packs and frac-packs. It requires only a basic understanding of oilfield operations and is intended for drilling, completion and production personnel with some sand control experience who are looking to gain a better understanding of each technique’s advantages, limitations and application window for use in their upcoming completions.
Yesterday’s practices are being superseded by an industry-wide trend towards the extensive use of historical and real-time data to understand, learn from and predict all well intervention operations. This course explores the impact of data analytics on well operations. Drawing on the presenter’s extensive experience in data analysis, it examines, in easily understandable terms, today’s data-management processes and how they target process improvement.
This paper will focus on the application of lithium-ion energy storage solutions (ESS) for offshore oil and gas (O&G) installations. It will discuss the benefits that can be achieved by integrating energy storage in hybrid power plants, using the West Mira semisubmersible installation in the North Sea as a representative case study. West Mira will be the world's first modern drilling rig to operate a low-emission hybrid (diesel-electric) power plant using lithium-ion batteries. The integration of energy storage with the power supply and distribution system of a drilling rig represents an important step towards improving the environmental sustainability of the offshore oil and gas industry by reducing emissions and paving the way to harnessing clean but intermittent renewables, such as offshore wind. Offshore rigs have highly variable power consumption for drilling and dynamic positioning. By incorporating energy storage, it is possible to reduce the runtime of combustion engines and keep them operating at an optimized combustion level. The installation of an ESS on West Mira will result in an estimated 42% reduction in the runtime of on-platform diesel engines, reducing CO2 emissions by 15% and NOx emissions by 12%, which is equivalent to the annual emissions of approximately 10,000 automobiles. The batteries on West Mira will be charged from the rig's diesel-electric generators and used to supply power during peak load times. In addition, they will serve as backup to prevent blackout situations and provide power to the thrusters in the unlikely event of loss of all running machinery.
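As a rough illustration of the peak-shaving principle described above, the sketch below simulates how a battery can absorb load peaks so that fewer engines need to run. The dispatch rule, load profile and ratings are hypothetical simplifications for illustration, not West Mira's actual control logic or data.

```python
# Illustrative peak-shaving sketch: a battery covers load peaks so fewer
# generator sets must run. All numbers here are hypothetical.

def simulate_peak_shaving(load_kw, gen_rating_kw, batt_kwh, dt_h=1 / 60):
    """Return the generator count needed each step, without and with storage."""
    soc = batt_kwh / 2                        # start half charged
    gens_plain, gens_hybrid = [], []
    for load in load_kw:
        gens_plain.append(-(-load // gen_rating_kw))   # ceil(load / rating)
        # Run the fewest engines that, with the battery, can still cover load.
        n = 1
        while True:
            surplus = n * gen_rating_kw - load          # +ve charges, -ve drains
            new_soc = soc + surplus * dt_h
            if surplus >= 0 or 0 <= new_soc <= batt_kwh:
                soc = min(max(new_soc, 0), batt_kwh)    # clip to battery limits
                break
            n += 1
        gens_hybrid.append(n)
    return gens_plain, gens_hybrid
```

With an alternating 800/1500 kW load and 1000 kW engines, the plain plant must start a second engine on every peak, while the hybrid plant rides the peaks on the battery with a single engine running.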
Video images have traditionally provided intuitive visual analysis in a wide range of wellbore diagnostic situations. Step changes in computer vision and image processing now make it possible to take measurements from images (visual analytics). This paper demonstrates several applications in which this new data analytics source, combined with state-of-the-art acquisition technology, has further improved understanding of complex well issues while reducing operational time, risk and cost. Examples include hydraulic fracturing, well integrity, erosion, restrictions and leaks. The paper describes the methods and process of this visual analytics technique through a discussion of the three main workflow stages, from data acquisition to final analytical product, including the innovative developments in sensor, system and computer vision applications that support each step: 1. Acquisition of full circumferential, depth-synchronized video data of the wellbore. An array of four orthogonally positioned cameras, pointing directly at the pipe wall, concurrently records overlapping images, enabling a continuous full-well video dataset to be obtained.
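A minimal sketch of how four overlapping orthogonal views might be assembled into one continuous circumferential strip. The geometry is assumed purely for illustration (a 100-degree field of view per camera and a simple crop-and-concatenate merge), not the tool's actual registration pipeline:

```python
import numpy as np

# Assumed geometry: four cameras each cover 100 degrees of the pipe
# circumference, so adjacent views overlap by 10 degrees. Trimming half
# the overlap from each edge and concatenating yields a 360-degree strip.

def stitch_circumference(frames, fov_deg=100.0):
    overlap = fov_deg - 90.0                         # degrees shared per seam
    h, w = frames[0].shape[:2]
    crop = int(round(w * (overlap / 2) / fov_deg))   # pixels to trim per edge
    return np.concatenate([f[:, crop:w - crop] for f in frames], axis=1)
```

Real systems would register and blend the overlap regions rather than crop them, but the trimming arithmetic above shows why overlapping views are needed for a gap-free circumferential image.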
Makwashi, Nura (Division of Chemical and Petroleum Engineering, London South Bank University) | Barros, Delcia Soraia David (Division of Chemical and Petroleum Engineering, London South Bank University) | Sarkodie, Kwame (Division of Chemical and Petroleum Engineering, London South Bank University) | Zhao, Donglin (Division of Chemical and Petroleum Engineering, London South Bank University) | Diaz, Pedro A. (Division of Chemical and Petroleum Engineering, London South Bank University)
Production, transportation and storage of highly waxy crude oil are very challenging. Such crudes are usually characterised by a high content of macro-crystalline waxes, predominantly n-alkanes (C18 to C36), which can cause costly deposition within the wellbore and production equipment. The accumulation of deposited wax can decrease oil production rates, cause equipment breakdown, and clog transport and storage facilities. Currently, different polymeric inhibitors are utilised in the oil and gas field to mitigate and prevent wax deposition. However, as of today, there is no single wax inhibitor that works effectively for all oil fields. One of the objectives of this work is to study the efficacy of a blended commercial wax inhibitor/pour-point depressant for wax deposition mitigation in a flow rig designed with 0- and 45-degree bends in the pipeline.
Standard laboratory techniques using high-temperature gas chromatography (HTGC), a rheometer rig, polarized microscopy and elution chromatography were employed to obtain the n-paraffin distribution, oil viscosity, wax appearance temperature (WAT), pour point and SARA fractions. A series of experiments was carried out with and without the inhibitor in a straight-pipe test section, and the severity of wax deposition in the pipeline built with a 45-degree bend was compared with that in the straight pipe. The blended inhibitor was tested at concentrations of 500, 1000 and 1500 ppm under laminar and turbulent conditions. The crude oil sample was found to be naturally waxy, with a wax content of 19.75 wt%, an n-paraffin distribution ranging from C15 to C74, and a WAT and pour point of 30°C and 25°C, respectively. Wax deposition in the test section was 43% more severe in the 45-degree bend than in the straight pipe. However, the severity of the deposition was reduced to 12.3% at extremely low temperature and flow rate. Nonetheless, better inhibition performance was achieved at 25 and 30°C. The wax thickness was reduced from
Traditional 2D design is in significant decline, increasingly replaced by the development of 3D models alone. This paper details how to develop algorithms that automate large aspects of a design review. These techniques significantly increase efficiency, ensure consistency and optimise the accuracy of the design, leading to reduced project costs.
Utilising the 3D model's enriched metadata and developing independent algorithms, it is possible to create a cyberphysical model that enables automation of the design review. For example, the geometrical data in the 3D model can be used to check a hazard with respect to a detector, confirming that the detector is located close to the hazard. Many checks follow this pattern; cataloguing and scripting them can be managed within PLM software.
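The hazard-to-detector check described above can be sketched as a simple geometric rule over model coordinates. The function name, data layout and 5 m coverage radius below are hypothetical illustrations, not taken from any real PLM schema:

```python
import math

# Hypothetical sketch of one automated design-review rule: every hazard
# source in the 3D model must have a gas detector within a coverage radius.
# Tags and coordinates are illustrative placeholders.

def check_detector_coverage(hazards, detectors, max_dist_m=5.0):
    """Return a review comment for each hazard lacking a nearby detector."""
    comments = []
    for tag, hx, hy, hz in hazards:
        nearest = min(math.dist((hx, hy, hz), (dx, dy, dz))
                      for _, dx, dy, dz in detectors)
        if nearest > max_dist_m:
            comments.append(f"{tag}: nearest detector {nearest:.1f} m away "
                            f"(limit {max_dist_m} m)")
    return comments
```

Each comment produced by a check like this becomes a review item on the design, which is how the automation closes back into the normal review workflow.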
Using algorithmic automation techniques reduces the overall design hours of a project and checks the consistency of the design. Getting it right first time reduces the number of changes later in the project lifecycle, avoiding expensive rework costs. During the first phase of this initiative, we have found that automation reduces design hours by 10% and increases the accuracy and consistency of the design review.
This first phase of automation uses the metadata in the 3D model, where the output of each check leads to a comment on the design. Scaling the pilot to encompass other data sources will further enrich the cyberphysical model. Ultimately, by creating a decisions database and using Artificial Intelligence, we will be able to close the loop, leading to a design that is fully evaluated before it leaves the designer. It is also possible to automate in other phases of the project lifecycle, where image recognition will compare the real asset to the model.
This level of automation is unique; other low-level forms of automation exist, but to our knowledge this advancement of the technology has not been attempted in the Oil and Gas sector. The development and scaling of this technology is novel and will have a significant impact on the way future projects are executed.
Offshore oil and gas installations are, by their nature, located in remote places that are both difficult and costly to access. Despite these challenges, the operating and maintenance requirements of such assets are constant and must be addressed, requiring operators to identify the most efficient form of service to reduce staffing levels, risk and cost.
Offshore hydrocarbon production assets commonly incorporate equipment and processes that can produce significant fugitive gas emissions. The consequences are both economic and social (environmental), requiring operators to perform emissions surveys aimed at identifying and remediating leaks within the shortest possible timeframe. The frequency of this activity is naturally limited and must be balanced against the staffing and operating needs of the broader facility, which in turn can lead to sub-optimal detect-to-fix timing and reliability.
Addressing the three key challenges of access productivity, detection reliability and results quantification, Worley has developed a remote sensing platform that combines productive remote-access equipment, such as unmanned aerial vehicles (UAVs) and in-situ monitors, with machine-based emissions detection and algorithmic quantification. The solution allows the operator to increase survey frequency, obtain more reliable results at lower cost, and perform the work in a manner consistent with safe, low-risk operations.
In both testing and field deployments, the results have shown significant reductions in both false positives and false negatives and have produced datasets that allow accurate indication of greenhouse gas reduction by comparing volumetric emissions before and after leak repair activity.
The technology is largely mathematical: coded machine-learning routines perform gas detection under (initially) supervised modeling conditions, and algorithmic gas-dispersion models provide further emission quantification. Surveys are typically carried out by integrating existing, proven sensing equipment across several types of UAV or in-situ monitor; this equipment collects field data and transmits it to a cloud-based portal that further processes the results.
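As an illustration of algorithmic quantification, a textbook Gaussian-plume inversion can back-calculate an emission rate from a single downwind concentration measurement. This is a generic approach, not the platform's actual model, and the dispersion-coefficient fits below are rough illustrative values for a single atmospheric stability class:

```python
import math

# Simplified Gaussian-plume inversion: given one downwind concentration
# measurement, back-calculate the emission rate Q. Since C = Q * plume_factor,
# inverting is a single division. Coefficient fits are illustrative only.

def emission_rate(conc, x, y, z, wind, src_height):
    """Estimate source emission rate (g/s) from one measurement (g/m^3)."""
    sigma_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)   # crosswind spread, m
    sigma_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)   # vertical spread, m
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - src_height) ** 2 / (2 * sigma_z ** 2))
                + math.exp(-(z + src_height) ** 2 / (2 * sigma_z ** 2)))
    plume = lateral * vertical / (2 * math.pi * wind * sigma_y * sigma_z)
    return conc / plume                               # Q = C / plume_factor
```

In practice multiple measurements would be combined and fitted, but the round trip (forward plume model, then inversion) is the core of turning a sensed concentration into a quantified leak rate.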
The approach has been shown to be effective in accessing hard- or costly-to-reach areas and improving survey productivity, while the data processing and quantification give the operator improved measurability and allow leak repairs to be prioritized accordingly.
This paper explains how half a billion hours of service data for pressure safety valves (PSVs) can be analysed and presented to allow maintenance intervals to be optimised and PSV inspections to be managed. The large amount of data creates challenges, but also opportunities for an enhanced methodology expandable to other equipment types.
A methodology has been developed that uniquely combines qualitative and quantitative analysis. The quantitative layer ensures that the risk from PSV failure is determined and falls below set criteria. The drawback of this criterion alone is that it is based on average data from a large set of PSVs, and the average may not be applicable everywhere, especially if failures are not random but have an underlying, potentially unknown, cause. Therefore, each PSV is also assessed qualitatively. To bridge the gap between individual PSVs and large sets, the methodology also identifies groups of PSVs whose data are collated and used across the whole group; the careful analysis required to define these groups, which must have similar properties or performance, is described. This multi-layered assessment extracts the most information possible from the data. A key part of the process is presenting the data for review and analysis, which is achieved through a digital, cloud-based interactive dashboard.
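A minimal sketch of the quantitative layer for one such group, using the standard low-demand approximation PFDavg ≈ λT/2 to solve for the longest test interval that meets a risk target. The failure counts, pooled hours and target below are illustrative, not from the paper's dataset:

```python
# Hedged sketch of a group-level quantitative check: pool proof-test results
# for a group of similar PSVs, estimate a failure rate, and solve for the
# longest test interval T that keeps the average probability of failure on
# demand (PFDavg ~ lambda * T / 2, the standard low-demand approximation)
# below a set criterion. All numbers are illustrative.

def group_interval_hours(failures, service_hours, pfd_target=1e-2):
    lam = failures / service_hours      # failures per pooled valve-hour
    return 2 * pfd_target / lam         # T such that lam * T / 2 == target

# Example group: 12 dangerous failures over 40 million pooled valve-hours.
t = group_interval_hours(12, 40e6)      # longest acceptable interval, hours
```

The qualitative layer then adjusts this group-level interval for individual PSVs whose history or service conditions deviate from the group average.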
The analysis has shown that maintenance intervals can be optimised significantly while risk is simultaneously reduced by concentrating effort on the worst-performing PSVs. Not least, a dashboard presentation of the risk-based inspection (RBI) results, showing the calculated inspection intervals, changes to those intervals, and failures, allows a clear picture of PSV performance to be developed. Maintenance planning also becomes easier, and the information required for deferral assessments is available in seconds rather than hours. The analysis shows where poorer performance can occur; this is often applicable across different assets. The way in which the approach can be expanded to other equipment types will be described.
The novel approach in the assessment is the multi-layered combination of qualitative and quantitative analysis and the presentation of a large amount of data through the cloud to be used by maintenance teams, technical authorities and operators. It also shows the benefits of collecting half a billion service hours of data and that this need not be an onerous task.