GE's DPI 620 Genii combines comprehensive, multifunctional test and calibration capabilities with digital HART and Foundation Fieldbus communication in a single handheld device. It is the first such portable device to feature intuitive swipe-screen technology, the company said. It integrates with calibration and maintenance software so that calibration and configuration data from field devices are available through enterprise resource planning systems.
Piazza, Ralph (Petrobras) | Vieira, Alexandre (Petrobras) | Sacorague, Luiz Alexandre (Petrobras) | Jones, Christopher (Halliburton) | Dai, Bin (Halliburton) | Price, Jimmy (Halliburton) | Pearl, Megan (Halliburton) | Aguiar, Helen (Halliburton)
This paper presents a new optical sensor configuration using a multivariate optical computation (MOC) platform to enhance chemical analysis during formation tester logging operations. The platform provides access to the mid-infrared optical region (λ ≈ 3.5 microns), which was previously inaccessible for in-situ, real-time chemical measurements in a petroleum well environment. The new technique has been used in the field for the analysis of carbon dioxide and synthetic drilling fluid components such as olefins.
MOC is a technique that uses an integrated computational sensor to perform an analog dot-product regression calculation on spectroscopic data optically, rather than by electronic digital means. Historically, a dot-product regression applied to spectroscopic data has required both a spectrometer and a digital computer to perform a chemical analysis. MOC sensors require neither, and because the key sensor component, the multivariate optical element (MOE), is a stable, temperature-robust solid-state element, the technique is well suited for downhole petroleum environments. A new dual-core configuration using two MOEs designed to work in parallel enhances measurement sensitivity, enabling mid-infrared analysis.
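As a hypothetical illustration (the function name, spectrum, and regression vector below are invented, not from the paper), the dot-product regression that an MOE performs optically amounts to a single inner product between the sample's spectrum and a regression vector encoded in the element's transmission profile:

```python
import numpy as np

# Hypothetical illustration only: an MOE acts as an optical filter whose
# transmission profile encodes a regression vector b. The light reaching
# the detector is proportional to the dot product of the sample's
# spectrum s with b, yielding the analyte prediction without a
# spectrometer or a digital computer.

def moe_dot_product(spectrum, regression_vector, offset=0.0):
    """Analog of the MOE measurement: prediction = b . s + offset."""
    return float(np.dot(regression_vector, spectrum)) + offset

# Toy two-band example (values are illustrative, not calibration data).
spectrum = np.array([0.8, 0.3])   # normalized intensities in two bands
b = np.array([1.0, -1.0])         # regression vector encoded in the MOE
print(round(moe_dot_product(spectrum, b), 6))   # 0.5
```

In the physical device this multiplication and summation happen in the optics, which is why the measurement is fast and robust downhole.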
Spectroscopic measurements were performed on 32 base oils that were reconstituted to reservoir compositions over a wide temperature and pressure range, up to 350°F and 20,000 psi, for a total of 12 condition combinations per base oil. Live gas compositions (i.e., reservoir conditions) were achieved by adding multiple methane, ethane, propane, and carbon dioxide charges to each base fluid. The reconstituted petroleum fluids were further mixed with olefin-based synthetic drilling fluid (SDF). This rigorous experimental design allowed solid-state MOEs to be designed to operate under the same reservoir conditions. Laboratory validation showed measurement accuracies of ±0.43 wt% over a range of 0 to 16 wt% CO2 and ±0.4 wt% over 0 to 10 wt% SDF. A wireline formation tester optical section was modified with the MOC dual-core configuration to enable mid-infrared detection of both carbon dioxide and olefins. This formation tester was then deployed in five wells, collecting seven samples from various locations. The downhole SDF and carbon dioxide measurements were subsequently found to be in good agreement with laboratory analysis, with field results for valid pumpouts showing accuracies of 0.5 wt% CO2 and 1.0 wt% olefins across ranges of 1.2 to 22 wt% CO2 and 1.4 to 9.7 wt% SDF.
Carbon dioxide is an important component of petroleum whose presence and concentration affect completion options, surface facilities, and flow assurance, thereby affecting the operational costs of petroleum production. Olefins are a primary component of synthetic drilling fluid (SDF), although other mid-infrared-active components such as esters, ketones, alcohols, and amines can also be present. High concentrations of SDF in openhole formation tester samples lower the quality of laboratory phase-behavior analysis and thereby increase the monetary risk in asset development, especially when conducting reservoir production simulations. It is therefore important to monitor both components during formation tester sampling operations.
Maintaining well control is one of the most important considerations of any drilling operation, and early detection of formation fluid influx or mud losses is vital to safe drilling.
Today's gain-and-loss detection tends to trigger too many false alarms; major improvements in reliability (few to no false alarms) and reactivity (no missed events) are needed without being user dependent. The newly developed system optimizes both accuracy and efficiency. It maintains a false alarm rate lower than that of the current system while detecting influxes or losses as small as 40 gal, and it applies to a wide variety of configurations: deepwater, managed pressure drilling, land rig operations, etc. This performance is achieved through new flow-modeling processes combined with automated settings, real-time quality control, and guided, intuitive software interfaces. From a purely user-dependent system, the kick detection software has moved to automated processes, ensuring repeatable and optimal detection performance while minimizing the risk of human error. The detection of abnormal flow conditions in the well relies on the comparison of predicted and measured flow at the exit of the well. Improvements to the flow modeling, such as new, calibrated pump-efficiency models based on isothermal modeling of the pumps, increase the robustness and reactivity of the detection system. The presented case studies quantify the improvement in kick detection performance between the existing system and the new version, benchmarking both influx-detection reactivity and system reliability. The kick detection charts used in the study represent a new way of illustrating detection performance.
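A minimal sketch of the core comparison described above, assuming a simple cumulative-gain threshold (the function name, sample values, and exact alarm logic are illustrative assumptions, not the actual system's algorithm):

```python
# Hypothetical sketch: an influx alarm fires when the cumulative
# difference between measured and predicted flow-out exceeds a volume
# threshold. All names and values here are invented for illustration.

def detect_influx(measured_out, predicted_out, dt_min, threshold_gal=40.0):
    """Return the first sample index at which the cumulative gain
    (integrated measured-minus-predicted flow-out) exceeds threshold_gal,
    or None if no influx is detected."""
    gain = 0.0
    for i, (m, p) in enumerate(zip(measured_out, predicted_out)):
        gain += (m - p) * dt_min          # gal/min * min -> gal
        if gain > threshold_gal:
            return i
    return None

# Ten samples at 1-min spacing; a 10-gal/min discrepancy starts at t = 5 min.
measured = [300] * 5 + [310] * 5
predicted = [300] * 10
print(detect_influx(measured, predicted, dt_min=1.0))
```

The real system's flow model replaces the constant `predicted` series here with physics-based predictions (e.g., the calibrated pump-efficiency models), which is what keeps the threshold low without raising false alarms.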
Moreno Ortiz, Jaime Eduardo (Schlumberger) | Gossuin, Jean (Schlumberger) | Liu, Yunlong (Schlumberger) | Klemin, Denis (Schlumberger) | Gurpinar, Omer (Schlumberger) | Gheneim Herrera, Thaer (Schlumberger)
Challenges in EOR process upscaling have been discussed extensively in the industry, and the effects of diffusion, dispersion, heterogeneity, force balance, and frontal velocity, among others, have been recognized and qualified, along with the importance of understanding the numerical model's finite-difference equations and modeling strategy. Augmenting the upscaling complexity is the often-limited understanding of, and data on, the EOR displacement at different scales (from micro to full field), including the EOR agent/rock/fluid interactions, for which only limited information is typically available at the early stages of EOR process de-risking.
A common denominator for EOR process characterization and upscaling (along with the discretization of the displacement) is the non-unique nature of the problem. As the complexity of the numerical representation of the EOR process increases (thus increasing data characterization requirements), so does the number of plausible solutions and the challenge of dealing with an otherwise incomplete dataset. Digital rock has evolved into a strong complement to laboratory corefloods, allowing EOR agent optimization on a high-resolution digital representation of the pore structure, detailed digital fluid models of both reservoir fluids and EOR agents, and physical rock/EOR agent/reservoir fluid interactions, thus providing several calibration points to ensure that finite-difference model calibration and upscaling preserve the process behavior.
This paper discusses the use of digital rock solutions in EOR deployment, particularly in translating the results to numerical finite-difference models, addressing the inherent laboratory measurement uncertainty, and proposing a fit-for-purpose multi-scale upscaling strategy that addresses both the effects of heterogeneity and EOR agent characterization during the upscaling process.
This paper addresses the challenges of upscaling chemical flooding, particularly polymer flooding, using a real-life polymer injection case in which digital rock, coreflood, and, more importantly, pilot results are available to test and validate our observations. Using polymer coreflood and digital rock results as input, numerical finite-difference simulation models were built and calibrated to reproduce the displacement physics observed in both the digital rock and the corefloods. Digital flood results were used to bridge the laboratory-to-numerical-model step by providing effective upscaled polymer properties as well as intrinsic rock properties such as relative permeability and capillary pressure, which were then taken through a series of multi-scale finite-difference models to identify, validate, and quantify upscaling requirements, addressing polymer deformation through pore throats and effective simulation viscosity. Digital rock is used to rank and resolve ambiguity in the finite-difference model calibration by providing an otherwise rare opportunity to visualize the displacement in 3D space. The analysis sheds new light on fluid-fluid and fluid-rock interactions at the pore scale and enabled us to improve the finite-difference model generation and polymer properties.
Along-hole depth is the most fundamental subsurface measurement made in our business. Driller's depth, the basis of logging-while-drilling (LWD) data registration, and wireline logged depth are the primary sources of along-hole depth data. These are rarely congruent. This paper sets out a methodology that can be applied to both wireline and drill pipe along-hole depth measurement (Driller's Way-point Depth, DwpD) and provides an uncertainty calculation that results in True Along-hole (TAH) depth. The result is a reduction in differences between drill pipe and wireline depth measurements, increasing confidence in along-hole depth measurements and increasing well data value. A field data set is reviewed that highlights the similarities and differences between DwpD and the wireline corrections.
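The DwpD/TAH methodology itself is not reproduced here; as a generic, hypothetical illustration of the uncertainty-calculation idea, independent along-hole depth error sources are often combined in root-sum-square fashion to quote a single uncertainty alongside the corrected depth (the function name and error budget below are invented):

```python
import math

# Generic illustration, not the paper's method: independent 1-sigma
# depth error sources combine in root-sum-square fashion to give the
# overall along-hole depth uncertainty.

def combined_depth_uncertainty(sigmas):
    """Root-sum-square combination of independent 1-sigma depth errors."""
    return math.sqrt(sum(s * s for s in sigmas))

# Illustrative budget (m): e.g., pipe stretch and reference tie-in errors.
print(round(combined_depth_uncertainty([0.3, 0.4]), 2))   # 0.5
```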
Robust downhole fluid analysis relies heavily on quality measurements and data mapping from the various optical sensors implemented in modern formation testers. The direct modeling used to determine a multivariate correlation between optical sensor responses and diverse fluid compositions or properties is often cost-prohibitive with sensor-based calibration. This paper presents a novel method based on concatenated optical computing neural networks (COCN), which link sensor-specific signal transformation to generic fluid characterization through validated data mapping.
The COCN models are built separately in an in-house laboratory using optical data transformation networks and multi-analyte fluid characterization networks. To evaluate the uncertainty of sensor signal mapping, additional sensor data prediction networks, which exchange the inputs and outputs of the sensor data transformation networks to provide reverse data mapping, can be built to produce simulated sensor responses. The actual sensor measurements are then compared with the simulated sensor responses to validate the optical signal transformation. The degree of matching can be used to identify issues with the quality of the sensor measurements and the range of the calibration and testing data, and to select the transformation networks that ruggedize predictions of fluid compositions and properties.
This paper presents the method and case studies that exemplify the application of the proposed approach for reliable fluid sample characterization using laboratory and field measurement data. Calibrating sensor-independent fluid predictive networks on a large database usually yields low uncertainty because the analyzed measurements can be collected from diverse reservoir fluid samples. The construction of data transformation networks, however, is only realistic with a small number of reference fluids because of their high calibration cost, which in turn challenges the quality of data mapping for new field sensor measurements. In this study, sensor data transformation networks and sensor data prediction networks form two groups of mutually complementary models that can validate one another, making the quality of data mapping observable before fluid characterization models are applied. The results of the case studies demonstrate that the consistency between the actual and simulated sensor responses is a reasonable performance index for the prediction of fluid compositions and properties. The laboratory and field examples also justify the importance of using simulation data appropriately to overcome the limitations of sensor measurement data in the calibration of transformation networks.
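A minimal sketch of the validation idea, assuming the consistency check reduces to a channel-by-channel comparison of actual and simulated sensor responses (the function name, data, and 0.05 acceptance level are assumptions, not the paper's):

```python
import math

# Illustrative sketch: the simulated sensor response from the
# reverse-mapping network is compared channel by channel to the actual
# measurement, and a normalized RMS mismatch serves as a quality index
# for the data mapping before fluid characterization is applied.

def mapping_quality(actual, simulated):
    """Normalized RMS difference between actual and simulated responses."""
    n = len(actual)
    mse = sum((a - s) ** 2 for a, s in zip(actual, simulated)) / n
    scale = sum(a * a for a in actual) / n
    return math.sqrt(mse / scale)

actual = [1.00, 0.80, 0.60, 0.40]      # measured optical channels
simulated = [1.02, 0.79, 0.61, 0.38]   # reverse-network prediction
q = mapping_quality(actual, simulated)
print(q < 0.05)   # small mismatch: transformation judged reliable
```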
Galford, James (Halliburton) | Ortiz, Ricardo (Halliburton) | Neely, Jeffrey (Halliburton) | Heaton, Jennifer (Halliburton) | Vehra, Imran (Halliburton) | Wu, Junchao (Halliburton) | Leung, Matthew (Halliburton) | Chandrashekar, Natesh (Halliburton)
Today’s fast-paced development of petroleum resources depends on an efficient and accurate evaluation of both clastic and unconventional reservoirs. A new high-performance, slim logging-while-drilling (LWD) natural gamma ray spectroscopy tool has been developed to assist real-time petrophysical evaluations of net-to-gross for conventional reservoirs and to identify "sweet spots" for completion for unconventional reservoirs. Additionally, its azimuthal sensitivity can help position the well in lateral operations.
This new tool provides wireline quality formation thorium (Th), uranium (U), and potassium (K) elemental concentrations in real time that can quantify clay content, identify clay minerals, and estimate total organic content. Further, real-time processing provides a color display derived from a Briggs color cube rendition of relative elemental contributions that can be correlated with stratigraphic features in the field. This first-of-its-kind, real-time feature is output at a high sampling rate for the full azimuth of the borehole and should be a useful aid in geosteering applications where the goal is to maintain the borehole within a target formation or to follow a known stratigraphic feature.
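A hypothetical sketch of a color-cube style display (the actual Briggs color-cube mapping used by the tool may differ; the function name and values below are illustrative): relative Th, U, and K contributions are normalized and mapped to RGB channels so that lithology changes appear as color changes along the borehole image.

```python
# Hypothetical illustration of a Th/U/K-to-color mapping; the tool's
# actual Briggs color-cube rendition may differ.

def tuk_to_rgb(th, u, k):
    """Map relative Th/U/K contributions to an 8-bit RGB triple."""
    total = th + u + k
    if total == 0:
        return (0, 0, 0)
    return tuple(round(255 * x / total) for x in (th, u, k))

print(tuk_to_rgb(8.0, 2.0, 2.0))   # thorium-rich interval maps to reddish
```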
Calibration and characterization of the tool were performed using newly developed Monte Carlo modeling techniques superior to previously used laboratory techniques while maintaining direct links to industry standards at the API Gamma Ray Calibration and K-U-Th Logging Calibration Facilities at the University of Houston. These techniques were developed because the borehole at the API Gamma Ray Calibration Facility cannot accept the 5.25-in. collar diameter, and the potassium formation at the API K-U-Th Logging Calibration Facility is not reliable. The instrument is fully characterized for operations in barite-, hematite-, or formate-weighted water-based mud systems as well as barite- or hematite-weighted oil-based muds. Further, corrections for borehole potassium can be applied in real time.
A novel mechanical design enables the sensor to operate at temperatures up to 329°F and borehole pressures up to 25,000 psi while minimizing the attenuation of formation gamma rays entering the detector and maintaining good azimuthal sensitivity. The tool uses a robust, constrained, weighted least-squares (WLS) analysis to derive elemental concentrations from measured pulse-height gamma ray spectra. Proprietary spectral processing algorithms regulate the detector gain without the use of an additional radioactive reference source and compensate for variations in the detector’s energy resolution caused by operating conditions within the borehole that change over time. A general description of the tool together with its operational specifications, details of the computer models used to calibrate and characterize its responses, and example logs from early field trials are presented in this paper.
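A minimal sketch of the WLS step described above, assuming the spectrum is decomposed into contributions from element standard spectra (the standards, weights, and three-channel spectrum are invented for illustration, and the constraints the tool applies are omitted for brevity):

```python
import numpy as np

# Hypothetical sketch: decompose a measured pulse-height spectrum into
# elemental yields using weighted least squares against standard spectra.
# All data here are synthetic; real tools use characterized responses
# and constrained solvers.

def wls_elemental_yields(spectrum, standards, weights):
    """Unconstrained WLS solve of standards @ x ~= spectrum with
    per-channel weights (e.g., inverse variances)."""
    A = np.asarray(standards)   # (channels x elements) standard spectra
    W = np.diag(weights)        # diagonal weight matrix
    AtW = A.T @ W
    return np.linalg.solve(AtW @ A, AtW @ np.asarray(spectrum))

# Three channels, two elements with known standard spectra.
standards = np.array([[1.0, 0.0],
                      [0.5, 0.5],
                      [0.0, 1.0]])
true_yields = np.array([2.0, 3.0])
spectrum = standards @ true_yields   # noise-free synthetic spectrum
weights = np.array([1.0, 1.0, 1.0])
print(wls_elemental_yields(spectrum, standards, weights))   # ~ [2. 3.]
```

With noisy spectra, the weights downweight low-count channels, which is what makes the estimate robust in practice.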
Al-Maqtari, Ameen N. (SAFER E&D Operations Company) | Saleh, Ahmed A. (SAFER E&D Operations Company) | Al-Haygana, Adel (SAFER E&D Operations Company) | Al-Adashi, Jaber (SAFER E&D Operations Company) | Alogily, Abdulkhalek (SAFER E&D Operations Company) | Warren, Cassandra (Schlumberger) | Mavridou, Evangelia (Schlumberger) | Schoellkopf, Noelle (Schlumberger) | Sheyh Husein, Sami (Schlumberger) | Ahmad, Ammar (Schlumberger) | Baig, Zeeshan (Schlumberger) | Teumahji, Nimuno Achu (Schlumberger) | Thiakalingam, Surenthar (Schlumberger) | Khan, Waqar (Schlumberger) | Masurek, Nicole (Schlumberger) | Andres Sanchez Torres, Carlos (Schlumberger)
A 3D petroleum systems model (PSM) of Block 18 in the Sab'atayn basin, onshore western Yemen, was constructed to evaluate the untapped oil and gas potential of the Upper Jurassic Madbi Formation. 3D PSM techniques were used to analyze petroleum generation for conventional reservoirs and the petroleum saturations retained in the source rock for the unconventional system. Block 18 has several proven petroleum systems and producing oil and gas fields. The principal source rocks are within the Madbi Formation, which comprises two units, the Lam and the Meem members. Both contain transgressive, organically rich "hot" shales with total organic carbon (TOC) of 8 to 10%; these are located stratigraphically at the base of each member. Additional organic-rich intervals within the Lam and Meem are less-effective source rocks, with lower TOC values.
The PSM consisted of 17 depositional events and 2 hiatuses. To accurately replicate geochemical and stratigraphic variations, the Lam and Meem members were further divided into sublayers. The model was calibrated to present-day porosity, permeability, and pressure data, and it incorporated vertical and lateral lithofacies and organic facies variations. Further calibrations used observed maturities (vitrinite reflectance and pyrolysis Tmax) and present-day temperatures and considered laterally variable heat flow from the Early Jurassic to the Late Miocene. Finally, petrophysical analyses from wells provided calculated hydrocarbon saturations, which were used to calibrate the saturation output from the model. The model satisfactorily reproduces the distribution of the main gas and oil fields and discoveries in the study area and is aligned with well test data.
Maturity results indicate that the upper Lam intervals currently sit within the main to early oil window but are immature at the edges of Block 18 (based on the Sweeney and Burnham EASY%Ro kinetics). The lowest Lam unit enters the wet gas window in the center of the block. The underlying Meem member ranges from wet gas to early oil window maturity. Like the Lam, the Meem remains immature along the edges of Block 18. However, in the south of the block, the richest source rocks within the Meem are mainly in the oil window. The degree of transformation of the Meem and Lam varies throughout the members. The model predicts that, at present, the lowest part of the Meem, containing the greatest TOC, has 90% of its kerogen transformed into hydrocarbons.
The model confirms that the Madbi Formation is a promising unconventional shale reservoir with a high quantity of hydrocarbons retained within it. Despite the higher quantity of hydrocarbons retained in the upper Meem, in terms of the liquid and vapor hydrocarbons predicted in this model, the lower Lam is the most prospective conventional tight sand reservoir, and the Meem has very limited potential as a tight sand reservoir. This study provided a novel application of 3D PSM technology to assess new unconventional as well as conventional plays in this frontier area.
Morales, Adrian (Chesapeake Energy Corp.) | Holman, Robert (Chesapeake Energy Corp.) | Nugent, Drew (Chesapeake Energy Corp.) | Wang, Jingjing (Chesapeake Energy Corp.) | Reece, Zach (Chesapeake Energy Corp.) | Madubuike, Chinomso (Chesapeake Energy Corp.) | Flores, Santiago (Chesapeake Energy Corp.) | Berndt, Tyson (Chesapeake Energy Corp.) | Nowaczewski, Vincent (Chesapeake Energy Corp.) | Cook, Stephanie (Chesapeake Energy Corp.) | Trumbo, Amanda (Chesapeake Energy Corp.) | Keng, Rachel (Chesapeake Energy Corp.) | Vallejo, Julieta (Chesapeake Energy Corp.) | Richard, Rex (Chesapeake Energy Corp.)
An integrated project can take many forms depending on the available data, from a horizontally isotropic model with estimated hydraulic fracture geometries used for simple approximations to a large-scale seismic-to-simulation workflow. Presented here is a large-scale workflow designed to take into consideration a vast array of data sources.
In this study, the team investigates a data-rich development area in the Eagle Ford. We develop a robust workflow that takes into account field data acquisition (seismic, 4D seismic, and chemical tracers); laboratory measurements and correlations (geomechanical, geochemistry, and PVT); petrophysical measurements (characterization, facies, and electrical borehole image); and real-time field surveillance (microseismic, MTI, and a fracture-hit prevention and mitigation program through pressure monitoring). Finally, all components of this complex, large-scale project are integrated into a common simulation platform (seismic, geomodeling, hydraulic fracturing, and reservoir simulation), which is used to run sensitivities.
The workflow developed and applied for this project can be scaled for projects of any size depending on the data available. After integrating data from the various disciplines, the following primary drivers and reservoir understanding can be concluded: at a given oil price, an optimum well spacing for a given completion strategy can be developed to maximize the project's rate of return. Many operators function in isolated teams with a genuine effort toward collaboration; however, genuine effort is not enough for a successful integrated modeling project, and a dedicated multidisciplinary team is required.
We present what is, to our knowledge, one of the most complete data sets used for an integrated modeling project to be presented to the public. The specific lessons from the project are applied to future Eagle Ford projects, while the overall workflow can be tailored and applied to any future field development.
This study examines the margin of usability of artificial intelligence (AI) algorithms for rock property distribution in static modeling. The novel method uses a forward-modeling approach with neural networks and genetic algorithms to optimize correlation patterns between seismic traces of stack volumes and well rock properties. Once a set of nonlinear functions correlating seismic traces and rock properties is optimized at the well locations, the spatial response is estimated using the seismic volume. This seismic characterization process depends directly on error minimization during the structural seismic interpretation process, as well as on honoring the structural complexity while modeling; both are key to obtaining an adequate correlation between well data and seismic traces. The joint mechanism of neural networks and genetic algorithms globally optimizes the nonlinear functions and their parameters to minimize the cost function, and the estimated objective function correlates well rock properties with seismic stack data. This mechanism is applied to real data with high structural complexity and several wells. As an output, calibrated petrophysical time volumes in the interval of interest are obtained. These properties are first used to generate a geological facies model; subsequently, the facies and seismic properties are used for the three-dimensional distribution of petrophysical properties such as rock type, porosity, clay volume, and permeability. Artificial intelligence algorithms can therefore be widely exploited to reduce uncertainty in the spatial estimation of rock properties.
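A minimal sketch of the genetic-algorithm side of the optimization, assuming a simple nonlinear mapping `property = a * attribute**b` whose parameters are evolved to minimize the misfit at well locations (the mapping, data, and GA settings are invented for illustration; the study couples genetic algorithms with neural networks on real seismic traces):

```python
import random

# Illustrative elitist GA: keep the fittest half of the population each
# generation and refill with Gaussian-mutated copies, minimizing the
# squared misfit between the mapped seismic attribute and the well data.

random.seed(7)

traces = [0.5, 1.0, 1.5, 2.0]   # seismic attribute at well locations
props = [1.0, 2.0, 3.0, 4.0]    # measured rock property at the wells

def cost(params):
    """Sum of squared errors of the nonlinear mapping at the wells."""
    a, b = params
    return sum((a * t ** b - p) ** 2 for t, p in zip(traces, props))

def evolve(pop, generations=200, mutation=0.1):
    """Elitist GA: keep the fittest half, refill with mutated copies."""
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[: len(pop) // 2]
        children = [(a + random.gauss(0, mutation),
                     b + random.gauss(0, mutation)) for a, b in parents]
        pop = parents + children
    return min(pop, key=cost)

population = [(random.uniform(0, 4), random.uniform(0, 3)) for _ in range(20)]
initial_best = min(cost(p) for p in population)
best = evolve(population)
print(cost(best) <= initial_best)   # elitism guarantees no regression
```

In the study's setting, the "parameters" evolved this way would be the weights of the nonlinear (neural network) functions rather than a two-parameter power law.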