Pan, Jin (Wuhan University of Technology, Wuhan) | Wang, Tao (Wuhan University of Technology, Wuhan) | Xu, Ming Cai (Huazhong University of Science and Technology, Wuhan / Collaborative Innovation Centre for Advanced Ship and Deep-Sea Exploration) | Gao, Gui (Wuhan-Jiujiang Railway Passenger Transportation Hubei Co. Ltd.)
The hull block erection network process, which is scheduled during the master production planning stage of a shipyard, is frequently delayed because of limited resources, limited workspace, and the block preparation ratio. In this study, delays in the block erection schedule are predicted by considering the variability of the block preparation ratio with a discrete event simulation algorithm. The simulation confirms how the key event observance ratio varies with the block preparation ratio and with the variability introduced by the block erection process that has the minimum lead time in a resource-limited environment. Furthermore, the optimal pitch value for key event concordance is calculated from the simulation results.
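As a minimal illustration of the kind of discrete event simulation the abstract describes, the sketch below erects blocks against a limited pool of docks and inflates each block's ready time when the preparation ratio drops. The delay model and all names are illustrative assumptions, not the authors' algorithm.

```python
import heapq

def simulate_erection(blocks, n_docks, prep_ratio):
    """Minimal discrete-event sketch: blocks wait for a free dock, and a low
    preparation ratio inflates each block's start delay (assumed model)."""
    dock_free = [0.0] * n_docks          # dock_free[i] = time dock i frees up
    heapq.heapify(dock_free)
    finish_times = []
    for ready_time, duration in blocks:  # (earliest start, erection time)
        # an unprepared block (prep_ratio < 1) is assumed to become ready late
        delayed_ready = ready_time / max(prep_ratio, 1e-9)
        start = max(heapq.heappop(dock_free), delayed_ready)
        end = start + duration
        heapq.heappush(dock_free, end)
        finish_times.append(end)
    return max(finish_times)             # schedule makespan

# two docks, three blocks: the lead time grows as the preparation ratio drops
print(simulate_erection([(0, 5), (0, 3), (2, 4)], 2, 1.0))  # → 7.0
print(simulate_erection([(0, 5), (0, 3), (2, 4)], 2, 0.5))  # → 8.0
```

Comparing makespans across preparation ratios is the simplest analogue of the paper's key-event-observance study.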
Abstract The industry is facing significant challenges due to the recent downturn in oil prices, particularly for the development of tight reservoirs. It is more critical than ever to 1) identify sweet spots with less uncertainty and 2) optimize completion-design parameters. The overall objective of this study is to quantify and compare the effects of reservoir quality and completion intensity on well productivity. We developed a supervised fuzzy clustering (SFC) algorithm to rank reservoir quality and completion intensity and to analyze their relative impacts on well productivity. We collected reservoir properties and completion-design parameters for 1,784 horizontal oil and gas wells completed in the Western Canadian Sedimentary Basin. Then, we used SFC to classify 1) reservoir quality, represented by porosity, hydrocarbon saturation, net pay thickness, and initial reservoir pressure; and 2) completion-design intensity, represented by proppant concentration, number of stages, and injected water volume per stage. Finally, we investigated the relative impacts of reservoir quality and completion intensity on well productivity in terms of first-year cumulative barrels of oil equivalent (BOE). The results show that in low-quality reservoirs, well productivity follows reservoir quality. In high-quality reservoirs, however, the role of completion design becomes significant, and productivity can be deterred by an inefficient completion design. The results suggest that in low-quality reservoirs, productivity can be enhanced with a less intense completion design, while in high-quality reservoirs, a more intense completion significantly enhances productivity. Keywords: reservoir quality; completion intensity; supervised fuzzy clustering; approximate reasoning; tight reservoir development
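The study's supervised fuzzy clustering algorithm is not public, but its unsupervised core, fuzzy c-means, can be sketched in a few lines of NumPy. The toy "reservoir quality" features and the cluster count below are assumptions for illustration only.

```python
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means, the unsupervised core of a fuzzy-clustering
    ranking. Returns the membership matrix U (n x c) and cluster centers."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per sample
    for _ in range(iters):
        W = U ** m                             # fuzzified weights
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))         # standard membership update
        U /= U.sum(axis=1, keepdims=True)
    return U, centers

# toy features: porosity and net pay thickness (illustrative numbers)
X = np.array([[0.05, 3.0], [0.06, 4.0], [0.20, 30.0], [0.22, 28.0]])
U, centers = fuzzy_cmeans(X)
labels = U.argmax(axis=1)
print(labels)  # low-quality and high-quality samples fall in separate clusters
```

The soft memberships in `U`, rather than the hard labels, are what a fuzzy ranking of reservoir quality would consume.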
Sochovka, Jon (Liberty Oilfield Services) | George, Kyle (Liberty Oilfield Services) | Melcher, Howard (Liberty Oilfield Services) | Mayerhofer, Mike (Liberty Oilfield Services) | Weijers, Leen (Liberty Oilfield Services) | Poppel, Ben (Liberty Oilfield Services) | Siegel, Joel (Liberty Oilfield Services)
Abstract The shale industry has changed beyond recognition over the last decade and is once again in rapid transition. While we are unsure about the nature of the innovations that will make US shale ever more competitive, we are certain that the current downturn will drive a further reduction in $/BO – the total cost to lift a barrel of US shale oil to the surface. As a result of an increase in scale and industry efficiency gains, the all-in price charged by service companies to place a pound of proppant downhole has come down from more than $0.50/lb in 2012 to about $0.10/lb today. In this paper, we discuss which components have contributed to this reduction to date and use several case studies to illustrate the potential for further cost reductions. The authors used FracFocus data to study a variety of placement and production chemicals for about 100,000 horizontal wells in US liquid-rich basins, including the Williston, Powder River, DJ, and Permian basins, as well as SCOOP/STACK and Eagle Ford. All chemicals used were averaged on a per-well basis into a gallons-per-thousand-gallons (gpt) metric. In the paper, we first provide an overview of trends by basin since 2010 for these chemical additives. Then, we perform Multi-Variate Analysis (MVA) to determine whether groups of these chemicals show an impact on production performance in specific basins or formations. Finally, through integration of lab testing (on fluid systems and proppants), a liquid-rich shale production database, and FracFocus tracking of industry trends, the authors developed a list of case histories that show modest to significant reductions in $/BO. In this paper we focus on proppant delivery cost – the cost to place a pound of proppant in a fracture downhole, where it can contribute to a well's production for years to come. The last decade saw a 10-fold increase in horsepower, a 20-fold increase in yearly stages pumped, and a 40-fold increase in yearly proppant mass.
One result of this increase in scale was a gain in efficiencies, which led to an average 3-fold decrease in the fracturing cost to place a pound of proppant downhole. We document this trend in detail in the paper. A significant industry trend over the last decade has been a "viscosity for velocity" trade: the change to smaller-mesh regional proppants, in combination with an increase in pump rates on frac jobs in the US, has allowed fluid systems to become more "watery". At the same time, the industry is moving from guar systems to polyacrylamide-based systems that exhibit higher apparent viscosities at low to ultra-low shear rates. These newer High Viscosity Friction Reducer (HVFR) systems show superior proppant-carrying capacity over traditional slickwater fluid systems, and regained-conductivity testing has shown that HVFR systems are generally cleaner for fracture conductivity than guar systems. Along with changes to base chemistry, a 2- to 5-fold increase in disposal costs and an overall "green initiative" over the last decade have resulted in a push to maximize recycled water usage on these HVFR jobs. These waters can exceed 150,000 TDS (Total Dissolved Solids), which presents challenges across the board when designing a compatible fluid system that meets the needs for viscosity yield, scale inhibition, and microbial mitigation, all while keeping costs low. Specialty chemicals, such as Hydrochloric Acid (HCl) substitutes that have similar efficacy to HCl but significantly lower reactivity with human skin, have helped significantly to improve operational safety around previously categorized hazardous chemicals, and have helped reduce cost and improve pump-time efficiency. Measurement of bacterial activity during and after fracture treatments can help with the most economic selection of the appropriate biocide. These simple measurements can further reduce what is spent on the chemical package necessary to effectively treat a well.
This paper provides a holistic view of fluid selection issues and shows a real-data focused methodology to further support a leaner approach to hydraulic fracturing.
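The per-well gpt normalization described in this abstract can be sketched with a small pandas aggregation. The column names and disclosure rows below are hypothetical stand-ins for FracFocus data, not its actual schema.

```python
import pandas as pd

# Hypothetical FracFocus-style disclosure rows: one row per chemical per well
rows = pd.DataFrame({
    "well_id": ["A", "A", "A", "B", "B"],
    "ingredient": ["friction reducer", "biocide", "scale inhibitor",
                   "friction reducer", "biocide"],
    "gallons": [2000.0, 40.0, 100.0, 1500.0, 30.0],
    "total_fluid_gal": [2_000_000, 2_000_000, 2_000_000,
                        1_500_000, 1_500_000],
})

# gpt = gallons of additive per thousand gallons of total fluid pumped
rows["gpt"] = rows["gallons"] / (rows["total_fluid_gal"] / 1000.0)
per_well = rows.pivot_table(index="well_id", columns="ingredient", values="gpt")
print(per_well.round(3))
```

Averaging the resulting per-well gpt values by basin and year would reproduce the trend-by-basin view the paper describes.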
Summary In this work, we investigate the efficient estimation of the optimal design variables that maximize net present value (NPV) for life-cycle production optimization during a single-well carbon dioxide (CO2) huff-n-puff (HnP) process in unconventional oil reservoirs. A synthetic unconventional reservoir model based on Bakken Formation oil composition is used. The model accounts for natural fracture and geomechanical effects. Both the deterministic (based on a single reservoir model) and robust (based on an ensemble of reservoir models) production optimization strategies are considered. The injection rate of CO2, the production bottomhole pressure (BHP), the durations of the injection and production periods in each cycle of the HnP process, and the cycle lengths for a predetermined life-cycle time can be included in the set of optimum design (or well control) variables. During optimization, the NPV is calculated by a machine learning (ML) proxy model trained to accurately approximate the NPV that would be calculated from a reservoir simulator run. As ML algorithms, we use both least-squares (LS) support vector regression (SVR) and Gaussian process regression (GPR). Given a set of forward simulation runs with a commercial compositional simulator that simulates the miscible CO2 HnP process, a proxy is built based on the chosen ML method. Having built the proxy model, we use it directly in an iterative-sampling-refinement optimization algorithm to optimize the design variables. As an optimization tool, the sequential quadratic programming (SQP) method is used inside this iterative-sampling-refinement optimization algorithm. The computational efficiencies of the ML proxy-based optimization methods are compared with those of the conventional stochastic simplex approximate gradient (StoSAG)-based methods. Our results show that the LS-SVR- and GPR-based proxy models are accurate and useful in approximating NPV in the optimization of the CO2 HnP process.
The results also indicate that both the GPR and LS-SVR methods exhibit very similar convergence rates, but GPR requires 10 times more computational time than LS-SVR. However, GPR offers flexibility that LS-SVR does not: it lets us assess the uncertainty in our NPV predictions because it carries the covariance information of the GPR model. Both ML-based methods prove to be quite efficient in production optimization, saving significant computational time (at least 4 times more efficient) over a stochastic gradient computed from a high-fidelity compositional simulator directly in a gradient-ascent algorithm. To our knowledge, this is the first study presenting a comprehensive review and comparison of two different ML-proxy-based optimization methods with traditional StoSAG-based optimization methods for the production optimization problem of a miscible CO2 HnP.
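A minimal version of the proxy-based optimization loop can be sketched with scikit-learn and SciPy: train a GPR proxy on sampled control settings, then maximize it with SLSQP (SciPy's SQP implementation). The quadratic "simulator" and the two control variables below are stand-ins for a compositional CO2 HnP run, not the paper's model.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Illustrative stand-in for the simulator: NPV as a function of two control
# variables (e.g. CO2 injection rate, soak time), each scaled to [0, 1].
def npv_simulator(x):
    return -((x[0] - 0.6) ** 2 + (x[1] - 0.3) ** 2)  # peak at (0.6, 0.3)

rng = np.random.default_rng(0)
X_train = rng.random((40, 2))                       # sampled control settings
y_train = np.array([npv_simulator(x) for x in X_train])

# GPR proxy trained to approximate the simulator's NPV response surface
proxy = GaussianProcessRegressor(kernel=RBF(0.3), alpha=1e-6,
                                 normalize_y=True).fit(X_train, y_train)

# SQP-style bound-constrained maximization of the proxy
res = minimize(lambda x: -proxy.predict(x.reshape(1, -1))[0],
               x0=[0.5, 0.5], bounds=[(0, 1), (0, 1)], method="SLSQP")
print(np.round(res.x, 2))  # close to the simulator's true optimum
```

In the paper's iterative-sampling-refinement scheme, the proxy optimum would be re-simulated and added to the training set before the next round; the sketch stops after one round.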
Summary The nonparametric transformation is a data-driven technique, which can be used to estimate optimal correlations between a dependent variable (response) and a set of independent parameters (predictors). This study introduces a systematic methodology using the nonparametric transformation concept and the alternating conditional expectation (ACE) algorithm to estimate the effective gas permeability using conventional logs and the core data. The ACE algorithm was employed in the current work using the MATLAB® (The MathWorks, Inc., Natick, Massachusetts, USA) code and the open-source GRaphical ACE (GRACE) software (Xue et al. 1997) for deriving the optimal nonparametric correlations for predicting the permeability. The methodology was applied to a heterogeneous formation [Bahariya (BAH)] in Egypt to understand its characteristics and predict its permeability more accurately. The BAH Formation is considered one of the main sources for oil production throughout the Western Desert (WD) of Egypt. The cumulative oil production from the BAH Formation is estimated to be approximately 40% of the total WD production. The reservoir characteristics of the BAH Formation range from highly permeable to tight sandstone interbedded with shale and siltstone. It usually depicts low-resistivity and low-contrast (LRLC) log behavior. Thus, regional and accurate determination of the reservoir permeability for the different rock units of the BAH Formation across the WD is a challenge. Conventional well log data from approximately 100 cored wells and corresponding 5,500 core measurements were used to provide a regional permeability correlation that can be used in a large number of reservoirs. 
The methodology of this work included two main steps: (1) applying the nonparametric transformation technique to identify the collective log responses for deriving the optimal correlation, and (2) predicting the permeability profiles using the selected log responses. The model was applied to many wells that address different petrophysical characteristics of the BAH Formation. The established permeability profiles showed reliable correlation coefficients relative to the measured core data. The correlation coefficient was 0.893 for the training data points (75% of the collected database) and 0.913 for the testing data points (25% of the collected database). In addition, the mean absolute percentage errors (MAPE) between the predicted and the measured permeability for the training and testing data points were 5.93% and 4.14%, respectively. Permeability prediction using ACE is compared with other techniques such as k-ϕ crossplots, multiple linear regression (MLR), and the Coates and Wyllie-Rose correlations. This work is an original contribution presenting regional permeability prediction correlations using conventional well logs for reservoir characterization and simulation applications. The ACE algorithm was successfully applied to the BAH Formation and proved its capability to identify the best predictors required to establish a rigorous model.
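A bare-bones, single-predictor version of the ACE idea can be sketched with bin-mean smoothing as the conditional-expectation estimator. The smoother, iteration count, and synthetic porosity-permeability data below are all simplifying assumptions relative to the GRACE implementation.

```python
import numpy as np

def bin_smooth(x, y, bins=10):
    """Conditional-expectation smoother: E[y | x] via equal-count bins of x."""
    order = np.argsort(x)
    ys = np.empty_like(y, dtype=float)
    for idx in np.array_split(order, bins):
        ys[idx] = y[idx].mean()
    return ys

def ace(x, y, iters=30):
    """Minimal single-predictor ACE sketch: alternately smooth the
    transforms theta(y) and phi(x) toward each other."""
    theta = (y - y.mean()) / y.std()
    for _ in range(iters):
        phi = bin_smooth(x, theta)                    # phi(x) = E[theta(y) | x]
        theta = bin_smooth(y, phi)                    # theta(y) = E[phi(x) | y]
        theta = (theta - theta.mean()) / theta.std()  # keep unit variance
    return theta, phi

# synthetic log/core pair: permeability depends nonlinearly on porosity
rng = np.random.default_rng(1)
por = rng.uniform(0.05, 0.3, 500)
perm = np.exp(10 * por) + rng.normal(0, 0.5, 500)
theta, phi = ace(por, perm)
print(round(np.corrcoef(theta, phi)[0, 1], 2))  # near 1: strong optimal correlation
```

The full algorithm extends the inner loop over several predictors and uses a supersmoother instead of bin means, but the alternating structure is the same.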
Abstract Basin-wide heterogeneity of production in unconventional resources creates additional risk in field development planning. In the past few years, several data-driven models have been developed to increase the accuracy of predicting the recovery from shale gas and tight oil wells. However, many of the machine learning methods with a so-called "black box" approach provide deterministic results, so the uncertainty associated with different development scenarios is difficult to obtain. We have investigated the underlying statistical distribution functions that govern the production rates and decline behavior of unconventional wells. Identification and quantification of these distribution functions provide a strong tool to accurately forecast the cumulative production of a large group of wells in an unconventional basin. By understanding the relationships among the geologic characteristics of different sections of the asset and the impact of varying drilling and completion parameters, capital can be allocated more efficiently. In this paper, we have identified the statistical distribution parameters of the decline behavior under a Power Law model. In doing so, we have used unsupervised clustering techniques to find an optimal number of clusters that enables observing well-behaved and identifiable underlying distribution functions. Furthermore, we quantified different types of distribution functions in a trial-and-error workflow to provide a tool for accurately evaluating the impact of varying geologic parameters on the decline behavior of these wells. Our results show that the leading term (or leading coefficient), which also highly correlates with long-term cumulative recovery, follows a Gamma distribution, while the power degree (or power coefficient) follows a Normal distribution.
Peak production rate (maximum average daily rate), terminal rate (rate after the switch point), and the time of terminal rate occurrence all follow a Log Normal distribution.
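Testing which named family best fits a decline parameter can be sketched with scipy.stats: fit each candidate and compare Kolmogorov-Smirnov statistics. The sample below is synthetic (drawn from a Gamma), not the paper's data.

```python
from scipy import stats

# Illustrative stand-in for a basin-wide sample of one decline parameter,
# e.g. the Power Law leading coefficient (synthetic, Gamma-distributed here)
sample = stats.gamma.rvs(a=3.0, scale=2.0, size=2000, random_state=0)

# Fit the candidate families named in the abstract, compare via KS statistic
candidates = {"gamma": stats.gamma, "norm": stats.norm, "lognorm": stats.lognorm}
scores = {}
for name, dist in candidates.items():
    params = dist.fit(sample)                       # maximum-likelihood fit
    scores[name] = stats.kstest(sample, name, args=params).statistic
best = min(scores, key=scores.get)
print(best)  # expected to be "gamma" on this synthetic sample
```

Running this per cluster of wells, as the paper does, would reveal whether each decline parameter keeps the same family across the basin.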
Abstract As operators shift their focus toward operating within cashflow, understanding the true potential of these unconventional resources is becoming increasingly important. Simultaneously, accurate modeling of EURs in shale wells is becoming increasingly complicated. There are multiple factors at play in this increase in complexity; key amongst them are well interactions. Well interactions, or interference, have increased with the concentration of field development in core areas of various basins and have completely changed the production behavior of shale wells. The present paper handles this multi-variable problem by incorporating well design, completion, and petrophysical variables in a prediction model. Furthermore, the analysis is presented from the viewpoint of parent, child, parent/child, and co-completed wells to accurately understand the variability in the driving factors. The terminal decline rate in shale wells is the decline rate wells settle at once the pressure transient reaches the boundary of the well. At this point, the well transitions to a boundary-dominated flow regime and continues to drain from a fixed area. Estimating the terminal decline rate is critical for accurate EUR modeling because changes in the transition point can have a significant impact on the production behavior of the well and, in turn, on EUR. The present paper attempts to predict the transition point using an ACE non-linear regression model trained on a large multi-variate dataset. Variables incorporated in this analysis include terminal decline month, gas-oil ratio based on the first three months of production, horizontal length, oil EUR, proppant per foot, average distance from the base of the producing zone, nearest-neighbor mean spacing, and hydrocarbon in place. To determine spacing status and nearest-wellbore distances, a segment-wise analytical distance approach was taken.
These distances and spacing-status flags were incorporated into a multi-variate model in order to model terminal decline rates. The transformations observed from the model showed high dependence on terminal decline month and oil EUR. However, this was less pronounced in parent/child and child wells, where completion metrics and HCIP more significantly influenced production behavior. Specifically, child wells saw a higher dependence on first-three-month GOR and lateral length, compared to parent/child wells, which had a higher dependence on proppant per foot and average distance from the base of the producing formation. Additionally, spacing showed a moderate impact on the transition point and associated terminal decline rates; overall, increased spacing caused a delayed transition point and consequently a lower terminal decline rate. Understanding how cause-and-effect relationships between parent and child wells differ offers a unique perspective on production behavior and consequently provides better insights into infill well placement and production prediction. The present paper offers a unique perspective on a key decline variable, the transition point, for shale reservoirs. By using multivariate analysis, it incorporates the incremental complexity of the modeling effort and attempts to provide best practices in understanding the impact on production behavior. Furthermore, by incorporating a segment-wise analytical distance approach to determine spacing, the paper adds to the existing body of literature by providing a new perspective from a well-interaction standpoint and defines the cause-and-effect relationships within.
Abstract Determining the closure pressure is crucial for optimal hydraulic fracturing design and successful execution of the fracturing treatment. Historically, the use of diagnostic tests before the main fracturing treatment has advanced significantly as a way to gain more information about the pattern of fracture propagation and fluid performance and to optimize designs. The goal is to inject a small volume of fracturing fluid to break down the formation and create a small fracture geometry; once pumping is stopped, the pressure decline is analyzed to observe the fracture closure. Many analytical methods, such as the G-function and the square root of time, have been developed to determine the fracture closure pressure. In some cases, however, the fracture closure pressure is difficult to determine, and personal bias and field experience make it challenging to interpret changes in the pressure-derivative slope and identify fracture closure. These conditions include: (1) high-permeability reservoirs, where fracture closure occurs very quickly because of rapid fluid leakoff; (2) extremely low-permeability reservoirs, which require a long shut-in time for the fluid to leak off before the fracture closure pressure can be determined; and (3) non-ideal fluid leakoff behavior under complex conditions. The objective of this study is to apply machine learning methods, implementing a predesigned algorithm to execute the required tasks and predict the fracture closure pressure, while minimizing the shortcomings of determining the closure pressure under non-ideal or subjective conditions. This paper demonstrates training different supervised machine learning algorithms to help predict fracture closure pressure. The workflow involves using the datasets to train and optimize the models, which are subsequently used to predict the closure pressure of the testing data. The output results are then compared with actual results from more than 120 DFIT data points.
We further propose an integrated approach to feature selection and dataset processing and study the effects of data processing on the success of the model prediction. The results from this study limit the subjectivity of, and the need for experienced personnel in, interpreting the data. We find that linear regression and MLP neural network algorithms can yield high scores in the prediction of fracture closure pressure.
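The two model families named above can be trained side by side in scikit-learn. The synthetic DFIT-style features and the linear-plus-noise closure-pressure target below are assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for DFIT-derived features (e.g. ISIP, G-function slope,
# pumped volume); closure pressure in kpsi is an assumed linear target.
rng = np.random.default_rng(0)
X = rng.random((300, 3))
y = 5.0 + 1.5 * X[:, 0] + 0.8 * X[:, 1] - 0.4 * X[:, 2] \
    + rng.normal(0, 0.02, 300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_tr)            # MLPs need scaled inputs

lin = LinearRegression().fit(X_tr, y_tr)
mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                   random_state=0).fit(scaler.transform(X_tr), y_tr)

print(round(lin.score(X_te, y_te), 2),
      round(mlp.score(scaler.transform(X_te), y_te), 2))  # held-out R^2
```

With a real DFIT dataset, the held-out R^2 comparison between the two models is the point of interest; on this synthetic linear target the linear model is naturally favored.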
Abstract Chemical flooding has been widely used to enhance oil recovery after conventional waterflooding. However, it is always a challenge to model chemical flooding accurately, since many of the model parameters cannot be measured accurately in the lab, and some cannot be obtained from the lab at all. Recently, ensemble-based assisted history matching techniques have proven efficient and effective in simultaneously estimating multiple model parameters. Therefore, this study validates the effectiveness of the ensemble-based method in estimating model parameters for chemical flooding simulation, and the half-iteration EnKF (HIEnKF) method has been employed to conduct the assisted history matching. In this work, five surfactant-polymer (SP) coreflooding experiments were first conducted, and the corresponding core-scale simulation models were built to simulate the coreflooding experiments. The HIEnKF method was then applied to calibrate the core-scale simulation models by assimilating observed data, including cumulative oil production and pressure drop, from the corresponding coreflooding experiments. The HIEnKF method was successfully applied to simultaneously estimate multiple model parameters, including the porosity and permeability fields, relative permeabilities, polymer viscosity curve, polymer adsorption curve, surfactant interfacial tension (IFT) curve, and miscibility function curve, for the SP flooding simulation model. There is good agreement between the updated simulation results and the observation data, indicating that the updated model parameters appropriately characterize the properties of the corresponding porous media and the fluid flow within it. At the same time, the effectiveness of the ensemble-based assisted history matching method in chemical enhanced oil recovery (EOR) simulation has been validated.
Based on the validated simulation model, numerical simulation tests have been conducted to investigate the influence of injection schemes and operating parameters of SP flooding on the ultimate oil recovery performance. It has been found that the polymer concentration, surfactant concentration and slug size of SP flooding have a significant impact on oil recovery, and these parameters need to be optimized to achieve the maximum economic benefit.
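The core update behind any EnKF-style assisted history matching (HIEnKF included) can be sketched in a few lines of NumPy. The one-parameter "simulator" below is a toy stand-in for the core-scale SP flooding model, not the paper's setup.

```python
import numpy as np

def enkf_update(M, D, d_obs, obs_err_std):
    """One stochastic ensemble Kalman update: M (n_param x n_ens) holds the
    parameter ensemble, D (n_obs x n_ens) the simulated observations."""
    n_ens = M.shape[1]
    dM = M - M.mean(axis=1, keepdims=True)
    dD = D - D.mean(axis=1, keepdims=True)
    C_md = dM @ dD.T / (n_ens - 1)               # parameter-data covariance
    C_dd = dD @ dD.T / (n_ens - 1)               # data covariance
    R = obs_err_std ** 2 * np.eye(len(d_obs))    # observation-error covariance
    K = C_md @ np.linalg.inv(C_dd + R)           # Kalman gain
    rng = np.random.default_rng(0)
    D_obs = d_obs[:, None] + rng.normal(0, obs_err_std, D.shape)
    return M + K @ (D_obs - D)                   # updated ensemble

# toy problem: unknown permeability multiplier k; the "simulator" predicts
# pressure drop = 2*k, and the truth k = 3 is observed with small noise
rng = np.random.default_rng(1)
M = rng.normal(1.0, 1.0, (1, 100))               # prior ensemble of k
D = 2.0 * M                                      # simulated pressure drops
M_post = enkf_update(M, D, np.array([6.0]), obs_err_std=0.1)
print(round(M_post.mean(), 1))                   # pulled toward k = 3
```

In the actual workflow, `M` would stack all the curve and field parameters listed in the abstract, and `D` the simulated production and pressure-drop histories.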
Abstract Leveraging publicly available data is a crucial step for decision making around investing in the development of any new unconventional asset. Published reports of production performance, along with accurate petrophysical and geological characterization of the area, help operators to evaluate the economics and risk profiles of new opportunities. A data-driven workflow can facilitate this process and make it less biased by enabling agnostic analysis of the data as the first step. In this work, several machine learning algorithms are briefly explained and compared in terms of their application in the development of a production evaluation tool for a target reservoir. Random forest, selected after evaluating several models, is deployed as a predictive model that incorporates geological characterization and petrophysical data along with production metrics into the production performance assessment workflow. Considering the influence of the completion design parameters on well production performance, this workflow also facilitates evaluation of several completion strategies to improve decision making around the best-performing completion size. Data used in this study include petrophysical parameters collected from publicly available core data, completion and production metrics, and the geological characteristics of the Niobrara formation in the Powder River Basin. Historical periodic production data are used as indicators of the productivity of a certain area in the data-driven model. This model, after training and evaluation, is deployed to predict the productivity of non-producing regions within the area of interest to help with selecting the most prolific sections for drilling future wells. Tornado plots are provided to demonstrate the key performance drivers in each focused area. A supervised fuzzy clustering model is also utilized to automate the rock quality analyses for identifying the "sweet spots" in a reservoir.
The output of this model is a sweet-spot map that is generated by spatially evaluating multiple reservoir rock properties. This map combines all the different reservoir rock properties into a single display that indicates the average "reservoir quality" of the formation in different areas. The Niobrara shale is used as a case study in this work to demonstrate how the proposed workflow is applied to a selected reservoir formation with enough historical production data available.
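A random forest workflow of the kind described can be sketched with scikit-learn. The synthetic petrophysical and completion features below, and their assumed link to a 12-month production metric, are illustrative stand-ins for the study's core and production data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in: petrophysical + completion features predicting a
# 12-month production metric (all relationships assumed for illustration)
rng = np.random.default_rng(0)
n = 500
porosity = rng.uniform(0.03, 0.12, n)
net_pay = rng.uniform(10, 80, n)
proppant_per_ft = rng.uniform(500, 2500, n)
cum12 = 1e4 * porosity * net_pay + 5 * proppant_per_ft + rng.normal(0, 500, n)

X = np.column_stack([porosity, net_pay, proppant_per_ft])
X_tr, X_te, y_tr, y_te = train_test_split(X, cum12, random_state=0)

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print(round(rf.score(X_te, y_te), 2))            # held-out R^2
# feature importances drive a tornado-style ranking of performance drivers
print(np.round(rf.feature_importances_, 2))
```

Predicting over a spatial grid of petrophysical values with a model like this is what produces the sweet-spot map the abstract describes.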