In this paper, the authors introduce a novel semianalytic approach to compute the sensitivity of the bottomhole pressure (BHP) data with respect to gridblock properties. Although principal-component analysis (PCA) has been applied widely to reduce the number of parameters characterizing a reservoir, its disadvantages are well-recognized.
The classical tornado chart is obtained by fixing all but one input at some base value and letting the one input vary from its minimum to maximum. A similar graph is used in Monte Carlo simulation software, in which the bar widths represent either rank correlation coefficients or stepwise linear regression coefficients.
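The one-at-a-time swing described above is straightforward to sketch in code. The following is a minimal illustration, not any particular software's implementation; the two-input model, its base values, and its bounds are all hypothetical examples:

```python
def tornado_ranges(model, base, bounds):
    """For each input, hold all other inputs at their base values and
    swing this one between its minimum and maximum, recording the
    resulting output range (the width of one tornado-chart bar)."""
    ranges = {}
    for name, (lo, hi) in bounds.items():
        outputs = []
        for value in (lo, hi):
            x = dict(base)       # all inputs at base...
            x[name] = value      # ...except the one being swung
            outputs.append(model(x))
        ranges[name] = (min(outputs), max(outputs))
    # Sort bars widest-first, as a tornado chart is drawn
    return sorted(ranges.items(),
                  key=lambda kv: kv[1][1] - kv[1][0], reverse=True)

# Hypothetical two-input model for demonstration
model = lambda x: 3.0 * x["a"] + 1.0 * x["b"]
bars = tornado_ranges(model,
                      base={"a": 1.0, "b": 1.0},
                      bounds={"a": (0.0, 2.0), "b": (0.5, 1.5)})
```

Because input `a` has the larger coefficient, its bar spans a wider output range and is drawn first, which is exactly the visual message of the tornado chart.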
One paper describes an accurate, three-step, machine-learning-based early warning system that has been used to monitor production and guide strategy in the Shengli field. Another discusses how machine learning, using multiple linear regression and a neural network, was used to optimize completions and well designs in the Duvernay shale.
The industry increasingly relies on forecasts from reservoir models for reservoir management and decision making. However, because forecasts from reservoir models carry large uncertainties, calibrating the models as soon as data come in is crucial. The complete paper explores the use of multilevel derivative-free optimization for history matching, with model properties described using principal-component-analysis (PCA)-based parameterization techniques. The authors' results showed promising benefits from a systematic procedure of model diagnostics, model improvement, and model-error quantification during data assimilation. A challenging problem in automated history-matching workflows is ensuring that, after updates are applied to previous models, the resulting history-matched models remain geologically consistent.
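A PCA-based parameterization of the kind mentioned above typically represents a model realization as the prior mean plus a weighted combination of leading principal components extracted from an ensemble of prior realizations. The paper's specific formulation is not given here; the following is a minimal pure-Python sketch under that general assumption, using power iteration to extract only the leading component from a tiny hypothetical ensemble:

```python
import math

def pca_basis(ensemble):
    """Mean, leading principal direction, and its eigenvalue for a small
    ensemble of realizations (power iteration on the sample covariance).
    Illustration only; real workflows retain several components."""
    n = len(ensemble[0])
    mean = [sum(r[j] for r in ensemble) / len(ensemble) for j in range(n)]
    X = [[r[j] - mean[j] for j in range(n)] for r in ensemble]
    # Sample covariance C = X^T X / (N - 1)
    C = [[sum(row[a] * row[b] for row in X) / (len(X) - 1)
          for b in range(n)] for a in range(n)]
    v = [1.0] * n
    for _ in range(200):  # power iteration toward the top eigenvector
        w = [sum(C[a][b] * v[b] for b in range(n)) for a in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    lam = sum(v[a] * sum(C[a][b] * v[b] for b in range(n)) for a in range(n))
    return mean, v, lam

def reconstruct(mean, v, lam, xi):
    """A realization parameterized by a single PCA coefficient xi."""
    return [m + xi * math.sqrt(lam) * vi for m, vi in zip(mean, v)]
```

History matching then optimizes over the low-dimensional coefficients (here a single `xi`) rather than over every gridblock property, which is what makes derivative-free optimization tractable.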
Scale control and inhibition are critical for maintaining flow assurance in oil production. Several specialty chemicals are used to delay, reduce, or prevent scale deposition; polymers and phosphonate-based chemicals, in particular, have been used extensively. Accurate and precise topside measurement of scale inhibitors plays an important role in assessing the efficiency of scale-squeeze and continuous-chemical-injection treatments. Numerous techniques currently exist for scale-inhibitor (SI) analysis, but each method has its own limitations, and these methods often report either total chemical content or elemental composition without details of chemical speciation. Furthermore, most techniques lack the ability to perform on-site analysis of fresh produced-water samples, which would yield quicker, more accurate, and more precise information because of minimal sample degradation. Nanotechnology-based surface-enhanced Raman spectroscopy (SERS) was developed as a next-generation method to fill the gap in speciation of phosphonates and to determine low concentrations of different scale-inhibitor chemicals in produced brines in a timely and cost-effective manner. Particular focus is placed on the individual and mixed analysis of a novel phosphonate and DETA phosphonate (DETPMP). Development of this method with handheld instrumentation provides better detection and quantification of scale inhibitors in the field and reduces time and cost compared with sending samples to off-site laboratories for data collection.
This work compares traditional petrophysical workflows with machine-learning methods for computing porosity and permeability in the Inyan Kara Formation of the Dakota Group in the Williston Basin. The comparison is based on dataset quality (variability, size, and robustness), basic theoretical approach, method, results, applicability in the geological area, and predictive power.
Produced-water disposal into the shallow Dakota Group presents challenges to nearby development drilling, potentially generating high-pressure zones in the overburden. Accurate pressure prediction for the Dakota is therefore valuable for drilling-design and development-timing decisions. Along with disposal volumes, porosity is the critical variable that determines formation pressure. The Dakota does not contain hydrocarbons, so porosity log data are limited; however, abundant resistivity and gamma ray logs exist through this interval. Therefore, a data-analytics model is used to compute porosity from location and the available resistivity and gamma ray logs. In addition, a modified Archie's equation is used to compute porosity from resistivity and gamma ray data. Both the data-analytics-based and resistivity-based porosities are compared with available neutron-density-log-derived porosities.
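The paper's specific modification of Archie's equation is not given here. As a hedged illustration only: because the Dakota contains no hydrocarbons, the water-saturated form of Archie's law applies (Sw = 1, so Rt = a·Rw/φ^m), which can be inverted for porosity, and a linear gamma-ray index is one common way to flag shaly intervals. All constants below (a, m, clean/shale GR endpoints) are hypothetical defaults, not the authors' calibrated values:

```python
def archie_porosity(rt_ohmm, rw_ohmm, a=1.0, m=2.0):
    """Porosity from deep resistivity in a fully water-saturated zone.
    Archie with Sw = 1:  Rt = a * Rw / phi**m  =>  phi = (a*Rw/Rt)**(1/m)."""
    return (a * rw_ohmm / rt_ohmm) ** (1.0 / m)

def gr_shale_volume(gr_api, gr_clean, gr_shale):
    """Linear gamma-ray index, a simple shale-volume estimate clamped
    to [0, 1]; shaly points would need a corrected porosity model."""
    vsh = (gr_api - gr_clean) / (gr_shale - gr_clean)
    return min(1.0, max(0.0, vsh))
```

For example, with Rw = 0.04 ohm·m and a measured deep resistivity of 4 ohm·m, the inversion gives a porosity of 0.10 under these assumed a and m.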
The project team comprised a petrophysicist, a reservoir engineer, and a geologist, who compiled and analyzed the data and built geologic and reservoir models based on this work to predict formation pressure in the Dakota sands.
The Williston Basin, North Dakota, has become one of the largest oil-producing regions in the United States. Large volumes of water are produced along with the oil, most of which is disposed of in the Inyan Kara Formation of the Dakota Group. The Dakota Group lies between the surface and the zones targeted for oil and gas development drilling. Water disposal into this shallower formation presents risks to nearby development drilling, potentially causing high-pressure zones in the overburden. Accurate pore-pressure prediction for the Dakota Group is valuable for drilling-design decisions. Along with disposal volumes, porosity is the critical variable that determines formation pressure.