Suarez-Rivera, Roberto (W. D. Von Gonten Laboratories) | Panse, Rohit (W. D. Von Gonten Laboratories) | Sovizi, Javad (Baker Hughes) | Dontsov, Egor (ResFrac Corporation) | LaReau, Heather (BP America Production Company, BPx Energy Inc.) | Suter, Kirke (BP America Production Company, BPx Energy Inc.) | Blose, Matthew (BP America Production Company, BPx Energy Inc.) | Hailu, Thomas (BP America Production Company, BPx Energy Inc.) | Koontz, Kyle (BP America Production Company, BPx Energy Inc.)
Abstract Predicting fracture behavior is important for well placement design and for optimizing multi-well development production. This requires fracturing models that are calibrated to field measurements. However, because hydraulic fracture models involve complex physics, uncertainties, and many variables, the problem of calibrating model results to field responses is ill-posed: there are more model variables that can be adjusted than field observations to constrain them. It is always possible to find a calibrated model that reproduces the field data; however, the model is not unique, and multiple matching solutions exist. The objective and scope of this work is to define a workflow for constraining these solutions and obtaining a more representative model for forecasting and optimization. We used field data from a multi-pad project in the Delaware play, with the actual pump schedules, frac sequence, and time delays as used in the field, for all stages and all wells. We constructed a hydraulic fracturing model using high-confidence rock properties data and calibrated the model to field stimulation treatment data by varying the two model variables with the highest uncertainty, tectonic strain and average leak-off coefficient, while keeping all other model variables fixed. By reducing the number of adjustable model variables for calibration, we significantly lower the potential for over-fitting. Using an ultra-fast hydraulic fracturing simulator, we solved a global optimization problem to minimize the mismatch between the ISIPs and treatment pressures measured in the field and those simulated by the model, for all stages and all wells. This workflow helps us match the dominant ISIP trends in the field data and delivers higher-confidence predictions of the regional stress. However, the uncertainty in the fracture geometry remains large.
We also compared these results with traditional workflows that rely on selecting representative stages for calibration to field data. Results show that our workflow identifies a global optimum that better represents the behavior of all stages on all wells, and allows us to provide higher-confidence predictions of fracturing results for subsequent pads. We then used this higher-confidence model to conduct sensitivity analysis for improving well placement in subsequent pads and compared the model predictions with the actual pad results.
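The two-variable calibration described above can be illustrated with a minimal global-search sketch. The linear forward model, parameter ranges, and ISIP values below are invented stand-ins, not the paper's simulator or field data; only the workflow structure (search two uncertain variables, hold everything else fixed, minimize the total mismatch over all stages) follows the abstract.

```python
import itertools

# Hypothetical linear forward model standing in for the fracturing simulator:
# predicted ISIP for a stage as a function of tectonic strain and leak-off.
def simulated_isip(strain, leakoff, stage_depth):
    return 0.8 * stage_depth + 5000.0 * strain - 200.0 * leakoff

# Synthetic field observations: stage depth (ft) -> measured ISIP (psi).
field = {9000: 7300.0, 9500: 7700.0, 10000: 8100.0}

def mismatch(strain, leakoff):
    """Sum of squared ISIP errors over all stages, the quantity minimized."""
    return sum((simulated_isip(strain, leakoff, d) - p) ** 2
               for d, p in field.items())

# Global grid search over the two uncertain variables only.
strains = [i * 1e-4 for i in range(0, 21)]      # illustrative range
leakoffs = [i * 0.5 for i in range(0, 11)]      # illustrative range
best = min(itertools.product(strains, leakoffs), key=lambda sl: mismatch(*sl))
```

A real application would replace the grid with a proper global optimizer and the toy model with simulator runs, but the ill-posedness argument is visible even here: many (strain, leakoff) pairs give nearly the same mismatch.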
Tsuchihashi, Naoki (University of Tokyo) | Wada, Ryota (University of Tokyo; corresponding author; email: firstname.lastname@example.org) | Ozaki, Masahiko (University of Tokyo) | Inoue, Tomoya (University of Tokyo) | Mopuri, Konda Reddy (Japan Agency for Marine-Earth Science and Technology) | Bilen, Hakan (University of Edinburgh) | Nishiyama, Tazuru (University of Edinburgh) | Fujita, Kazuhiro (Japan Petroleum Exploration Co., Ltd.) | Kusanagi, Kazuya (INPEX Corporation)
Summary A real-time stuck pipe prediction method using a deep-learning approach is studied in this paper. Early signs of stuck pipe (hereinafter called stuck) are assumed to show common patterns in the monitored data set, and designing a data clip that captures these features well is critical for efficient prediction. With valuable input from drilling engineers, we propose a 3D-convolutional neural network (CNN) approach with a depth-domain data clip. The clip illustrates depth-domain data as 2D-histogram images with a unique abstraction of the time domain. Data from 30 field wells, prepared as multivariate time series, are used in this study: 20 for training and 10 for validation. The validation data include six stuck incidents, and the 3D-CNN model successfully detected early signs of stuck in three cases before the occurrence. The portion of the data clip contributing to anomaly detection is indicated by a gradient-weighted class activation map (grad-CAM), providing a physical explanation of the black-box model. We consider such explanation indispensable for drilling engineers to interpret the signs for rational decision-making. Introduction Stuck is one of the major drilling problems that account for nonproductive time. In the past, one-third of the wells drilled in the Gulf of Mexico and the North Sea experienced stuck pipe problems (Howard et al. 1994). Moreover, stuck may lead to wellbore abandonment in the worst scenario. Given its severity, there is strong demand for early stuck-sign detection methods to avoid stuck incidents. The drillstring is considered stuck when it loses freedom of movement; that is, it can neither rotate nor move up and down, and it cannot be retrieved from the wellbore by external force. According to the mechanism, stuck is roughly classified into three types: mechanical stuck, differential stuck, and geometrical stuck (Alshaikh et al. 2018).
Each stuck type could be separated into smaller classes, for example, mechanical stuck includes both packoff and bridging.
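The depth-domain data clip described in the summary above can be illustrated with a minimal sketch that bins (depth, measurement) samples into a 2D-histogram image, abstracting away the time ordering. The bin edges, channel, and sample values below are invented for illustration; the paper's actual clip design stacks such histograms across channels for the 3D-CNN.

```python
def depth_histogram_clip(samples, depth_edges, value_edges):
    """Bin (depth, value) samples into a 2D-histogram 'image'.

    Rows are depth bins, columns are value bins; time ordering is
    deliberately abstracted away, which is the distinctive feature of
    the depth-domain clip.
    """
    rows, cols = len(depth_edges) - 1, len(value_edges) - 1
    grid = [[0] * cols for _ in range(rows)]
    for depth, value in samples:
        for i in range(rows):
            if depth_edges[i] <= depth < depth_edges[i + 1]:
                for j in range(cols):
                    if value_edges[j] <= value < value_edges[j + 1]:
                        grid[i][j] += 1
    return grid

# e.g., hypothetical hook-load readings recorded over a 100-110 m interval.
clip = depth_histogram_clip(
    samples=[(101.0, 12.0), (102.5, 13.0), (104.0, 12.5), (108.0, 40.0)],
    depth_edges=[100, 105, 110],
    value_edges=[0, 20, 50],
)
```

Repeated passes over the same depth interval (reaming, tripping) accumulate in the same rows, so pattern changes at a given depth stand out to the network regardless of when they occurred.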
Islamov, Rafael (Petronas Carigali Sdn Bhd) | Motaei, Eghbal (Petronas Carigali Sdn Bhd) | Madon, Bahrom (Petronas Carigali Sdn Bhd) | Abu Bakar, Khairul Azhar (Petronas Carigali Sdn Bhd) | Hamdan, Victor (Petronas Carigali Sdn Bhd) | W M Zani, Luqman (Petronas Carigali Sdn Bhd)
Abstract A dynamic Well Operating Envelope (WOE) ensures that a well is maintained and operated within design limits in a safe, stable, and profitable way. The WOE covers well integrity, reservoir constraints, and facility limitations and visualizes them on a well performance chart (Hamzat et al., 2013). Design and operating limits (such as upper and lower completion/facilities design pressures, sand failure, erosion limitations, and reservoir management related limitations) are identified and translated into a two-dimensional WOE (pressure vs. flowrate) to ensure that the maximum range of operating conditions representing safe and reliable operation is covered. VLP/IPR performance curves were incorporated based on the latest validated well model. The optimum well operating window represents the maximum range of operating conditions within the reservoir constraints assessed. By introducing actual well performance data, optimisation opportunities such as production/injection enhancement are identified. While generating the Well Operating Envelopes, considerable work was done to rectify challenges such as: most static data (i.e., design and reservoir limitations) not being digitized, unreliable real-time/dynamic data flow (i.e., FTHP, oil/gas rates, etc.), disintegrated and unreliable well models, and no solid workflows for flow assurance. As a prerequisite, workflows were developed to make the data tidy (i.e., ready and right), and well model inputs were integrated to build updated well models. A successful WOE prototype was generated for natural and artificially lifted oil and gas wells, and optimisation opportunities were identified (i.e., flowline pressure reduction, reservoir stimulation, and bean-up). Proactive maintenance is made possible through the dynamic WOE as a real-time exception-based surveillance (EBS) tool, allowing asset engineers to monitor well performance and maintain it within a safe, stable, and profitable window. Additionally, it allows tracking of all production enhancement jobs and seamless forecasting of new opportunities.
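The exception-based surveillance idea behind the WOE can be sketched as a simple check of an operating point (pressure, rate) against envelope limits. The limit names and numbers below are hypothetical placeholders, not the operator's actual constraints; a real WOE also carries curved boundaries from VLP/IPR and sand-failure models rather than simple box limits.

```python
def check_operating_point(fthp_psi, rate_stb_d, limits):
    """Return the list of envelope limits an operating point violates.

    An empty list means the well sits inside the safe, stable, and
    profitable window; any entry triggers an exception for review.
    """
    violations = []
    if fthp_psi > limits["max_fthp_psi"]:
        violations.append("completion/facility design pressure exceeded")
    if fthp_psi < limits["min_fthp_psi"]:
        violations.append("below stable-flow pressure floor")
    if rate_stb_d > limits["max_rate_stb_d"]:
        violations.append("erosion rate limit exceeded")
    return violations

# Illustrative limits only (not from the paper).
limits = {"max_fthp_psi": 3500.0, "min_fthp_psi": 800.0,
          "max_rate_stb_d": 6000.0}
ok_point = check_operating_point(2000.0, 4500.0, limits)    # inside envelope
bad_point = check_operating_point(3800.0, 6500.0, limits)   # two exceedances
```

Run against streaming FTHP and rate data, such a check is what turns the static envelope into the real-time EBS tool described above.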
Abstract. In this paper an approach to mimic the influence of appendages on the pressure distribution on a boat’s hull in RANS simulations is given. While, of course, the appendages could be modelled explicitly in the RANS simulation, this significantly increases the cell count and CPU-time requirements of the simulations, particularly for boats with multiple appendages. In this approach it is assumed that the pressure fields generated by the appendages can be decomposed into two parts: one related to lift (asymmetric) and one related to the displaced volume (symmetric). For these parts actuator line momentum theory is utilized, and doublet mass sources are described based on potential flow theory. An initial assessment of the approach’s capabilities and accuracy is presented based on the SYRF wide light series (Claughton, 2015), showing good promise. An application example with particular focus on the reduction of CPU-time requirement is given based on a boat fitted with canting keel and DSS foil.
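The displaced-volume (symmetric) part of the decomposition above rests on classical potential-flow doublets. As a minimal sketch, a 2D doublet of strength 2πUa² superposed on a uniform stream reproduces the flow past a cylinder of radius a, the simplest stand-in for a body of displaced volume; the lift-related (asymmetric) part, handled in the paper by actuator line momentum theory, is not shown. All parameters here are illustrative.

```python
def doublet_plus_stream_velocity(z, U=1.0, a=0.5):
    """Complex velocity u - i*v of a uniform stream plus a 2D doublet at
    the origin with strength mu = 2*pi*U*a**2, i.e., flow past a cylinder
    of radius a (potential-flow model of a displaced volume).
    """
    return U * (1.0 - (a / z) ** 2)

# Stagnation points appear at z = +/- a, as potential theory predicts,
# and the far field recovers the undisturbed stream velocity U.
v_stag = doublet_plus_stream_velocity(complex(0.5, 0.0))
v_far = doublet_plus_stream_velocity(complex(100.0, 0.0))
```

In the RANS context, the corresponding mass sources are distributed over a few cells instead of a point singularity, which is what keeps the cell count low compared with resolving the appendage geometry.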
Abstract Traditional well planning and drilling risk management require revisiting past experiences for a given drilling volume, and new well project evaluation is expected to require multidisciplinary teams and a plurality of data usage and access. To date, there is little integration of drilling data and mud logs into the decisional data space used for well planning. Integrating drilling data and mud logs into the models used for well planning allows the data to be analyzed and used for evaluation in order to answer questions such as: Will we encounter stringers? What is the expected ROP during the project? How can I de-risk the project in the planning phase? We have developed a methodology that allows well planning and associated subsurface models to be created in a geo-referenced drilling data space. At its heart, the method requires building the static subsurface model using historical well trajectories, drilling data, and mud log data. From this point of view, the subsurface mathematical data space becomes pure drilling space: the drilling data first populate the virtual space coordinates, allowing all otherwise standard 3D interpolation and intersection techniques to be used. Our approach provides flexibility and completeness of data distribution, because all wells have past recorded drilling data, while other data sources, such as petrophysical or seismic data, may be very sparse. We have tested this approach on data from one field, and the methodology is robust for using historical data to populate the model and plan new wells. Using driller definitions, we have built an expedited method to detect stringers and match the new observations with sub-seismic events and petrophysical evaluations. A potential application of this method beyond what we have tested is scaling up to perform rapid comparative exercises with other information and data sets. Sources of uncertainty, and their possible propagation through the drilling data space, are briefly introduced. Having a historical database of drilling data in geo-referenced space gives confidence to well planning and helps the drilling team obtain updated drilling information and perform risk management prior to drilling.
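The "standard 3D interpolation" step mentioned above can be sketched with a simple inverse-distance-weighted estimate of a drilling parameter at a planned-well location from historical geo-referenced samples. The coordinates, ROP values, and weighting power below are illustrative, not the paper's field data or chosen interpolator.

```python
import math

def idw_interpolate(target, samples, power=2.0):
    """Inverse-distance-weighted estimate of a drilling parameter (e.g.,
    ROP) at a 3D geo-referenced point from historical (x, y, z, value)
    samples along offset-well trajectories.
    """
    num = den = 0.0
    for x, y, z, value in samples:
        d = math.dist(target, (x, y, z))
        if d == 0.0:
            return value  # exactly on a historical sample
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# Hypothetical historical ROP (ft/hr) at offset-well locations.
offsets = [(0, 0, 9000, 80.0), (500, 0, 9000, 60.0), (0, 500, 9050, 70.0)]
rop = idw_interpolate((250, 250, 9020), offsets)
```

Because every drilled well contributes densely sampled drilling data along its trajectory, such interpolation is far better supported here than it would be for sparse petrophysical or seismic control points.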
This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper IPTC 19714, “Deep Learning for Monitoring Well-Testing Burner Efficiency,” by Darine Mansour, SPE, Hakim Arabi, and Yingwei Yu, SPE, Schlumberger, et al., prepared for the 2020 International Petroleum Technology Conference, Dhahran, Saudi Arabia, 13-15 January. The paper has not been peer reviewed. Copyright 2020 International Petroleum Technology Conference. Reproduced by permission. During well-testing operations, incorrect burner combustion could have an adverse effect on the environment or personal safety. Combustion efficiency is assessed by personnel who observe the flame. This practice lacks consistency and poses challenges, including environmental and safety considerations and issues with data. In the complete paper, the authors propose a solution that uses a deep neural network that learns from flame videos to define the quality of the combustion. The results are promising and show that this solution is a good candidate for real-time burner-efficiency monitoring and automatic alarm triggering and optimization. Introduction Automation of the evaluation of combustion efficiency would promote consistent combustion efficiency and timely reaction to undesirable combustion. In field operations, cameras are positioned for a direct view of the burner flames. The authors use a data set of video images captured by the cameras and labeled by domain experts as images of acceptable or undesirable combustion. The goal of the study is to automate human monitoring of the combustion by processing these images and classifying them automatically as images of acceptable or undesirable combustion. A supervised learning approach is applied to the problem. The approach consists of extracting the flame features from each image and feeding the features to a classifier that categorizes the combustion. 
The classifier is trained using the extracted features and labeled combustion images as input. The traditional approach to feature extraction is manual feature engineering, involving the extraction of handcrafted features using image-processing algorithms and domain knowledge. The process is tedious and time-consuming. Much effort is required to extract all relevant features in various contextual and environmental conditions. The extracted features are problem-dependent, and the process may require revisiting algorithms to adapt to new data that hold new features or new representations of existing features. Approach To achieve classification, a convolutional neural network (CNN) is trained using a data set of images labeled by domain experts. Using the image data set captured by well-testing cameras, a CNN with an architecture similar to VGG-16 that learns to extract powerful features from natural images is fine-tuned.
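The supervised pipeline described above (features in, acceptable/undesirable label out) can be sketched with a tiny logistic-regression classifier standing in for the fine-tuned VGG-16-style CNN. The two handcrafted features, the toy data, and all numbers are invented for illustration; the paper's point is precisely that the CNN learns its features instead of relying on such handcrafting.

```python
import math

def train_logistic(features, labels, lr=0.5, epochs=500):
    """Stochastic-gradient training of a logistic-regression classifier:
    flame features in, combustion label out (1 = acceptable)."""
    n = len(features[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in zip(features, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid probability
            g = p - y                         # gradient of log-loss w.r.t. z
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z > 0 else 0

# Toy features: (flame brightness, smoke fraction); label 1 = acceptable.
X = [(0.9, 0.1), (0.8, 0.2), (0.3, 0.7), (0.2, 0.9)]
y = [1, 1, 0, 0]
w, b = train_logistic(X, y)
```

Swapping this classifier head onto learned CNN features, rather than two handcrafted ones, is essentially what fine-tuning a pretrained network accomplishes.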
In this paper, the authors present an open-source tool kit for the generation of microfabricated transparent models of porous media (micromodels) from image data sets using optically transparent 3D polymer additive manufacturing (3D printing or sintering). These micromodels serve as research and pedagogical tools that facilitate the direct visualization of drainage and imbibition within quasi-2D porous media, generated from a range of image modalities [e.g., thin section micrographs, µ-computed tomography (μCT) orthoslices, and conventional digital photography]. Though recent advances in 3D X-ray imaging, such as X-ray microtomography and µCT, permit volumetric imaging of microcore flood experiments within geological samples at the pore scale, experimental observations of dynamic (time-resolved) multiphase flow within pore networks are still obtained conventionally using transparent etched or molded synthetic porous media commonly referred to as micromodels. Typically, video footage of fluid imbibition and drainage experiments conducted across these quasi-2D pore networks is used to understand fluid distributions and displacement mechanisms within an equivalent 3D porous medium. In contrast to state-of-the-art dynamic µCT coreflood experiments, which require synchrotron beam time, micromodel studies can be undertaken routinely in a laboratory-based setting with a relatively simple experimental setup.
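The first step in turning an image data set into a printable pore network is segmenting pore from grain. A minimal sketch, assuming a simple global intensity threshold (real workflows use more robust segmentation), converts a grayscale slice into a binary pore mask from which a quasi-2D geometry can be extruded; the pixel values below are invented.

```python
def pore_mask(slice_2d, threshold):
    """Threshold a grayscale image slice (e.g., a uCT orthoslice) into a
    binary pore map: True = pore (dark pixel), False = grain."""
    return [[pixel < threshold for pixel in row] for row in slice_2d]

# Tiny illustrative 'orthoslice' of 8-bit intensities.
slice_2d = [
    [200, 40, 210],
    [ 35, 30, 220],
]
mask = pore_mask(slice_2d, threshold=128)

# The mask also gives the 2D porosity of the micromodel directly.
porosity = sum(v for row in mask for v in row) / (len(mask) * len(mask[0]))
```

Extruding each True pixel to a fixed etch depth yields the quasi-2D channel geometry that the transparent 3D-printing step then fabricates.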
The opinions expressed in this article are the author's and not those of Dow Chemicals. Inclusion and diversity have gone beyond buzzwords to become an existential basis for every major company and organization in the present climate. Equality for all, as outlined in the US Constitution, is both a virtue and a value for every progressive establishment that seeks to remain relevant and have a prosperous future. According to a Pew Research Center study, among the millennial and Gen Z generations, who will serve as the primary recruiting pool of the STEM workforce in the coming years, diversity and inclusion are a key requirement when choosing whether to accept employment offers. With International Women's Day held annually on 8 March, one of the more obvious issues in this space is gender inclusivity, as demonstrated in the disparity statistics.
Hybrid steam/solvent processes have gained importance as a thermal-recovery process for heavy oils in recent years. Among the identified physical mechanisms that play a role during these processes are heat-transfer phenomena, gravity drainage and viscous flow, solvent mass transfer, and mass-diffusion/-dispersion phenomena. In this paper, a study of sensitivity to grid size is described. Ideally, this work will provide some insight into methodological aspects to be considered when hybrid steam/solvent processes are modeled. Recent studies concerning the size of the liquid-solvent-rich zone where molecular diffusion and dispersion occur have implied that a detailed representation of the solvent/steam-chamber edge is necessary in the numerical model.
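The grid-size sensitivity at the solvent/steam-chamber edge comes down to resolving a thin diffusive layer. A minimal 1D sketch, with illustrative (not field-calibrated) diffusivity and grid sizes, shows how an explicit finite-difference solution of solvent diffusion from a fixed-concentration boundary depends on cell count: the diffusion length √(Dt) must span several cells for the edge to be represented.

```python
def diffuse_1d(n_cells, length=1.0, D=1e-3, t_end=10.0):
    """Explicit finite-difference solution of 1D solvent diffusion from a
    fixed-concentration boundary (c = 1 at x = 0), zero initial condition.
    Parameters are illustrative only.
    """
    dx = length / n_cells
    dt = 0.4 * dx * dx / D            # below the explicit stability limit 0.5
    c = [0.0] * n_cells
    c[0] = 1.0
    t = 0.0
    while t < t_end:
        new = c[:]
        for i in range(1, n_cells - 1):
            new[i] = c[i] + D * dt / dx**2 * (c[i + 1] - 2 * c[i] + c[i - 1])
        new[0] = 1.0                  # fixed-concentration boundary
        c = new
        t += dt
    return c

# Coarse vs. fine grid over the same domain and (approximate) time window.
coarse, fine = diffuse_1d(10), diffuse_1d(40)
```

With √(Dt) ≈ 0.1 here, the coarse grid puts the entire mixing zone inside roughly one 0.1-wide cell, which is the kind of under-resolution the studies cited above warn against.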
The difficulty of applying traditional reservoir-simulation and -modeling techniques to unconventional-reservoir forecasting makes the use of statistical and modern machine-learning techniques a relevant proposition. However, most current applications of these techniques ignore the systematic time variations in production-decline rates. This paper proposes a nonparametric statistical approach using a modern technique termed functional data analysis (FDA). In FDA, production data are modeled as a time series composed of a sum of weighted smooth analytical basis functions. Many companies have adopted a so-called "data-centric process" for understanding and forecasting in unconventional reservoirs. This data-centric process is a consequence of the shortcomings of conventional reservoir-data-analysis and -modeling approaches, which mostly belong to the preshale era.
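The FDA representation described above, production as a weighted sum of smooth basis functions, can be sketched with a least-squares fit to two exponential basis functions. The basis choice, time constants, and synthetic decline data below are illustrative; practical FDA typically uses richer bases such as B-splines.

```python
import math

def fit_two_basis(times, rates, tau1=5.0, tau2=50.0):
    """Least-squares fit of production data as
    q(t) = w1*exp(-t/tau1) + w2*exp(-t/tau2),
    solving the 2x2 normal equations in closed form."""
    b1 = [math.exp(-t / tau1) for t in times]
    b2 = [math.exp(-t / tau2) for t in times]
    a11 = sum(x * x for x in b1)
    a12 = sum(x * y for x, y in zip(b1, b2))
    a22 = sum(y * y for y in b2)
    r1 = sum(x * q for x, q in zip(b1, rates))
    r2 = sum(y * q for y, q in zip(b2, rates))
    det = a11 * a22 - a12 * a12
    w1 = (r1 * a22 - r2 * a12) / det
    w2 = (a11 * r2 - a12 * r1) / det
    return w1, w2

# Synthetic decline data generated from known weights, then recovered.
times = list(range(0, 60, 5))
rates = [300.0 * math.exp(-t / 5.0) + 80.0 * math.exp(-t / 50.0)
         for t in times]
w1, w2 = fit_two_basis(times, rates)
```

Fitting every well's history to a common basis reduces each curve to a few weights, which is what makes cross-well statistics and forecasting tractable in the FDA framework.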