The hull block erection process, performed during the master production planning stage of a shipyard, is frequently delayed because of limited resources, workspace constraints, and the block preparation ratio. In this study, delays in the block erection schedule are predicted by considering the variability of the block preparation ratio with a discrete event simulation algorithm. The variation of the key event observance ratio is quantified with respect to the variability introduced by the block erection process, which has the minimum lead time in a limited-resource environment, and by the block preparation ratio. Furthermore, the optimal pitch value for key event concordance is calculated from the simulation results.
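The abstract does not include the simulation itself, but its core idea (erection steps competing for a limited resource while a stochastic preparation ratio pushes block ready times past plan) can be sketched in a minimal Monte Carlo form. Everything below, including the delay model, parameter values, and function name, is an illustrative assumption, not the authors' algorithm:

```python
import random

def key_event_observance(n_blocks=20, pitch=3.0, prep_mean=0.9, prep_sd=0.05,
                         tolerance=2.0, trials=2000, seed=42):
    """Fraction of Monte Carlo trials in which the final key event
    (last block erected) finishes within `tolerance` of the plan."""
    rng = random.Random(seed)
    on_time = 0
    for _ in range(trials):
        finish = 0.0
        for i in range(n_blocks):
            planned_start = i * pitch
            # a preparation ratio below 1.0 pushes the block's ready time past plan
            prep = max(min(rng.gauss(prep_mean, prep_sd), 1.0), 0.5)
            ready = planned_start + pitch * (1.0 - prep) / prep
            start = max(finish, ready)   # single erection crew: the limited resource
            finish = start + pitch       # erecting one block takes one pitch
        if finish <= n_blocks * pitch + tolerance:
            on_time += 1
    return on_time / trials
```

Sweeping `pitch` over a range of candidate values and picking the smallest pitch whose observance ratio stays above a target would mirror the paper's search for an optimal pitch.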
The submarine H. L. Hunley conducted the first successful submarine attack on an enemy vessel, USS Housatonic, during the American Civil War but was lost with all hands under unknown circumstances. The submarine has been recovered, and recent archeological findings have revealed that a spar torpedo was used, as opposed to the standoff torpedo commonly assumed. As a result, the submarine would have been in closer proximity to the weapon when it exploded than previously thought. A multipart investigation has been conducted to determine whether this reduced standoff distance could explain the mysterious loss of the vessel in the minutes or hours after the attack. Here, the results of a bottom-up naval architectural and weapons-effects analysis are reported. Together, the experimental, computational, and analytical results provide new insight into the vessel’s stability characteristics, propulsion, and dynamic loading environment during the attack. In addition, a discussion of possible loss scenarios, informed by both calculation results and inspections of the vessel’s hull, is presented. Although the story of what happened to H. L. Hunley that night remains shrouded in mystery after this work, several important new research questions emerge.
Suarez-Rivera, Roberto (W. D. Von Gonten Laboratories) | Panse, Rohit (W. D. Von Gonten Laboratories) | Sovizi, Javad (Baker Hughes) | Dontsov, Egor (ResFrac Corporation) | LaReau, Heather (BP America Production Company, BPx Energy Inc.) | Suter, Kirke (BP America Production Company, BPx Energy Inc.) | Blose, Matthew (BP America Production Company, BPx Energy Inc.) | Hailu, Thomas (BP America Production Company, BPx Energy Inc.) | Koontz, Kyle (BP America Production Company, BPx Energy Inc.)
Abstract Predicting fracture behavior is important for well placement design and for optimizing multi-well development production. This requires fracturing models that are calibrated to represent field measurements. However, because hydraulic fracture models include complex physics, uncertainties, and many defining variables, calibrating modeling results to field responses is an ill-posed problem: there are more adjustable model variables than field observations to constrain them. It is always possible to find a calibrated model that reproduces the field data; however, the model is not unique, and multiple matching solutions exist. The objective and scope of this work is to define a workflow for constraining these solutions and obtaining a more representative model for forecasting and optimization. We used field data from a multi-pad project in the Delaware play, with the actual pump schedules, frac sequence, and time delays used in the field, for all stages and all wells. We constructed a hydraulic fracturing model using high-confidence rock properties data and calibrated it to field stimulation treatment data by varying the two model variables with the highest uncertainty, tectonic strain and average leak-off coefficient, while keeping all other model variables fixed. By reducing the number of adjustable model variables for calibration, we significantly lower the potential for over-fitting. Using an ultra-fast hydraulic fracturing simulator, we solved a global optimization problem to minimize the mismatch between the ISIPs and treatment pressures measured in the field and those simulated by the model, for all stages and all wells. This workflow helps us match the dominant ISIP trends in the field data and delivers higher-confidence predictions of the regional stress. However, the uncertainty in the fracture geometry remains large.
We also compared these results with traditional workflows that rely on selecting representative stages for calibration to field data. Results show that our workflow finds a global optimum that better represents the behavior of all stages on all wells and allows us to provide higher-confidence predictions of fracturing results for subsequent pads. We then used this higher-confidence model to conduct sensitivity analysis for improving the well placement in subsequent pads and compared the model predictions with the actual pad results.
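The structure of this calibration, fixing all model variables except the two most uncertain and minimizing the ISIP mismatch over every stage of every well, can be illustrated with a toy stand-in. The forward model below is a made-up linear placeholder for the hydraulic fracturing simulator, and the grid search is a stand-in for whatever global optimizer was actually used:

```python
def forward_isip(tectonic_strain, leakoff, stage_depth):
    """Toy placeholder for the fracturing simulator: ISIP rises with
    tectonic strain and falls with the leak-off coefficient (coefficients
    are arbitrary, for illustration only)."""
    return 0.8 * stage_depth + 500.0 * tectonic_strain - 230.0 * leakoff

def calibrate(field_isips, depths, strain_grid, leakoff_grid):
    """Search only the two highest-uncertainty variables, minimizing the
    total squared ISIP mismatch over all stages of all wells."""
    best = None
    for s in strain_grid:
        for c in leakoff_grid:
            err = sum((forward_isip(s, c, d) - obs) ** 2
                      for d, obs in zip(depths, field_isips))
            if best is None or err < best[0]:
                best = (err, s, c)
    return best  # (misfit, strain, leak-off)
```

Restricting the search to two variables is what keeps the inverse problem from being hopelessly underdetermined, which is the workflow's central point.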
Abstract Uniformity of proppant distribution among multiple perforation clusters affects treatment efficiency in multistage fractured wells stimulated using the plug-and-perf technique. Multiple physical phenomena taking place in the well and perforation tunnels can cause uneven proppant distribution among clusters. The problem has been studied in recent years with experimental and computational fluid dynamics (CFD) methods, which provide useful insights but are impractical for routine designs. Simplified models that incorporate a proppant transport efficiency (PTE) correlation, derived from CFD results, into a hydraulic fracture model have also been presented in the literature. In this paper, we present a numerical model that simulates the transient proppant slurry flow in the wellbore, accounting for proppant transport and settling including bed formation, rate- and concentration-dependent pressure drop, PTE, and dynamic pressure coupling with the hydraulic fractures. The model is efficient and is designed as an independent wellbore transport model, so it can be integrated with any fracture model, including fully 3D and/or complex fracture network models, for practical design optimization. The model predictions are compared and found to agree with previously published studies. Parametric studies demonstrate the sensitivity of proppant distribution to grain size, fluid viscosity, and pumping rate for fixed perforation designs. Analysis of the simulation results shows that the dominant cause of uneven proppant distribution is proppant inertia. Possible slurry stratification is less important, except for cases with relatively low flow rates and near-toe clusters. Accordingly, proppant distribution is less sensitive to perforation phasing than to the number of perforations in clusters. Altering the number of perforations per cluster within a stage enables a more even proppant distribution.
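The qualitative mechanism, proppant inertia keeping grains in the axial stream so that downstream clusters receive more sand, can be shown with a much cruder heel-to-toe mass balance than the paper's transient model. The function and the `pte` value below are illustrative assumptions only:

```python
def proppant_distribution(perfs_per_cluster, pte=0.8):
    """Heel-to-toe mass balance: each cluster takes a share of the remaining
    fluid proportional to its perforation count, but proppant follows that
    fluid with transport efficiency `pte` < 1 because grain inertia keeps
    particles in the axial stream, so proppant concentrates toward the toe."""
    prop = 1.0                      # normalized proppant rate entering at the heel
    remaining_perfs = sum(perfs_per_cluster)
    out = []
    for i, n in enumerate(perfs_per_cluster):
        frac = n / remaining_perfs  # this cluster's share of the remaining fluid
        if i == len(perfs_per_cluster) - 1:
            taken = prop            # the toe cluster receives whatever is left
        else:
            taken = prop * frac * pte
        out.append(taken)
        prop -= taken
        remaining_perfs -= n
    return out
```

With equal perforation counts this sketch yields a toe-biased split; reducing the perforation count on toe-side clusters evens it out, which is the design lever the abstract describes.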
Liu, Xinghui (Chevron Corporation) | Wang, Jiehao (Chevron Corporation) | Singh, Amit (Chevron Corporation) | Rijken, Margaretha (Chevron Corporation) | Chrusch, Larry (Chevron Corporation) | Wehunt, Dean (Chevron Corporation) | Ahmad, Faraj (Colorado School of Mines) | Miskimins, Jennifer (Colorado School of Mines)
Abstract Multi-stage plug-and-perf fracturing of horizontal wells has proven to be an effective method to develop unconventional reservoirs. Various studies have shown uneven fluid and proppant distributions across perforation clusters. It is commonly believed that both fracturing fluid and proppant contribute to unconventional well performance, so achieving uniform fluid and proppant placement is an important step toward optimal stimulation. This paper discusses how to achieve such uniform placement in each stage via a computational fluid dynamics (CFD) modeling approach. Lab-scale CFD models were built and calibrated using experimental data on proppant transport through horizontal pipes in several laboratory configurations. A field-scale model was then built and validated against perforation erosion data from downhole camera observations, using the same model parameters calibrated at lab scale. With the field-scale model validated, CFD simulations were performed to evaluate the impact of key parameters on fluid and proppant placement in individual perforations and clusters. Key parameters investigated in this study included perforation parameters (size, orientation, number), cluster spacing, cluster count per stage, fluid properties, proppant properties, pumping rates, casing sizes, and stress shadow effects. Both lab and CFD results show that bottom-side perforations receive significantly more proppant than top-side perforations because of gravitational effects. Lab and CFD results also show that proppant distribution is increasingly toe-biased at higher rates, and proppant concentration along the wellbore from heel to toe generally varies significantly. Gravity, momentum, viscous drag, and turbulent dispersion are key factors affecting proppant transport in horizontal wellbores.
This study demonstrates that near-uniform fluid and proppant placement across all clusters in each stage is achievable by optimizing perforation, cluster, and other treatment design factors. Validated CFD modeling plays an important role in this design optimization process.
Abstract Well spacing and stimulation design are among the highest-impact design variables dictating the economics of an unconventional development. The objective of this paper is to showcase a numerical simulation workflow, with emphasis on the hydraulic fracture simulation methodology, that optimizes well spacing and completion design simultaneously. The workflow is deployed using cloud computing, a step-change over past simulation methods. The workflow showcased in this paper covers the whole cycle of 1) petrophysical and geomechanical modeling, 2) hydraulic fracture simulation, and 3) reservoir simulation modeling, followed by 4) design optimization using advanced non-linear methods. The focus of this paper is the hydraulic fracture simulation methods, which are an integral part of this workflow. The workflow is deployed on a dataset from a multi-well pad completed in late 2018 targeting two landing zones in the Vaca Muerta shale play. On a calibrated petrophysical and geomechanical model, hydraulic fracture simulations are conducted to map the stimulated rock around the wellbores. A finely gridded base model is used to capture the property variation between layers and estimate fracture height. The 3D discrete fracture network (DFN) built for the acreage is used to pick the natural fracture characteristics of the layers intersected by the wellbores. The methodology advances past modeling approaches by including the variation of the discrete fracture network between layers. The hydraulic fracture model, in conjunction with reservoir flow simulation, is used for history matching the production data. On the history-matched model, a design of experiments (DOE) simulation study is conducted to quantify the impact of a wide range of well spacing and stimulation design variables. These simulations are facilitated by recent deployments of cloud computing.
Cloud computing allows hundreds of hydraulic fracturing and reservoir simulations to run in parallel, enabling many combinations of stimulation designs and well spacings to be tested and reducing the effective run time from 3 months on a local machine to 1 week on the cloud. Outputs from the parallel simulations are fitted with a proxy model to select the well spacing and stimulation design variables that offer the minimum unit development cost, i.e., capital cost in $ per EUR-bbl. The workflow illustrates that stimulation design and well spacing are interlinked and need to be optimized simultaneously to maximize the economics of an unconventional asset. Using the workflow, the team identified development designs that increase the EUR of a development area by 50-100% and reduce the unit development cost ($/bbl-EUR) by 10-30%.
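The DOE-plus-proxy step can be illustrated with a toy one-variable version: fit a response surface to the simulated design points, then minimize the proxy instead of rerunning the simulator. The spacing values and unit costs below are made up, and the quadratic fit is a stand-in for whatever proxy form the team used:

```python
import numpy as np

# Hypothetical DOE results: well spacing (ft) vs. unit development cost ($/bbl-EUR)
spacing = np.array([400.0, 600.0, 800.0, 1000.0, 1200.0])
unit_cost = np.array([14.2, 11.8, 10.9, 11.5, 13.1])

# Quadratic proxy fitted to the simulated design points
coeffs = np.polyfit(spacing, unit_cost, deg=2)
proxy = np.poly1d(coeffs)

# Minimize the cheap proxy on a fine grid instead of rerunning the simulator
grid = np.linspace(spacing.min(), spacing.max(), 401)
best_spacing = grid[np.argmin(proxy(grid))]
```

In the actual workflow the proxy is multi-dimensional (spacing plus stimulation design variables), but the principle is the same: the expensive coupled simulations are only run at the DOE points.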
A new Chevron-led workflow is allowing the oil company to marry organic field data with physics-based simulation models and machine-learning techniques to arrive at a more accurate prediction of well performance and, ultimately, a reliable production forecast for unconventional oil fields. Standard production forecast techniques for unconventional asset development rely mostly on field data, which can suffer from limitations in both quality and quantity. Interpreting subsurface dynamics directly from field observations is also a challenge. Popular methods such as decline curve analysis can be hampered by limited data samples and too many variables. Reservoir simulation depends mostly on finding a good history match for the current field, but this method is resource-intensive and requires specific expertise.
Hampton, Thomas J. (Consultant) | El-Mandouh, Mohamed (Consultant) | Weber, Stevan (Consultant) | Thaker, Tirth (Computer Modelling Group) | Patel, K. (Computer Modelling Group) | Macaul, Barclay (Computer Modelling Group) | Erdle, Jim (Computer Modelling Group)
Abstract Mathematical models are needed to aid in defining, analyzing, and quantifying solutions for designing and managing steam floods. This paper discusses two main modeling methods: analytical and numerical simulation. Deciding which method to use, and when, requires an understanding of the assumptions, strengths, and limitations of each. This paper presents advantages and disadvantages through a comparison of analytical versus simulation approaches as reservoir characterization becomes progressively more complex (dip, layering, heterogeneity between injector/producer, and reservoir thickness). While there are many analytical models, three are used in this paper: Marx & Langenheim, Modified Neuman, and Jeff Jones. The simulator used was CMG STARS on a single pattern for both 5-spot and 9-spot configurations and, for Case 6, on nine 5-spot patterns. Results were obtained for 6 cases of varying reservoir properties based on the Marx & Langenheim, Modified Neuman, and Jeff Jones models. Simulation was also performed on each of the 6 cases, using Modified Neuman steam rates and then Jeff Jones steam rates, on 9-spot and 5-spot patterns. This was done on a predictive basis from the inputs provided, without adjusting or history matching against analog or historical performance. Optimization runs using Particle Swarm Optimization were applied to one case to minimize SOR and maximize NPV. The conclusion from comparing the cases is that simulation is needed for complex geology, heterogeneity, and changes in layering. Simulation can also be used to maximize economics with an AI-based optimization tool. With their limitations understood, the analytical models are good for quick looks such as screening, scoping design, some surveillance, and conceptual understanding of a basic steam flood with uniform geologic properties.
This paper is innovative in its comparison of analytical models and simulation modeling. Results quantifying the differences in oil rate, SOR, and injection rates (Neuman and Jeff Jones) and their impact on recovery factors are presented.
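The paper does not detail its Particle Swarm Optimization setup, but the standard PSO update it relies on (particles moving under inertia plus attraction to personal and global bests) can be sketched generically. The objective, bounds, and coefficient values below are illustrative, not those used for the SOR/NPV case:

```python
import random

def pso(objective, bounds, n_particles=20, iters=150, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm minimizer: velocity = inertia + cognitive pull
    toward the particle's own best + social pull toward the swarm's best."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # keep particles inside the search bounds
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

For the steam-flood case, `objective` would wrap a simulation run and combine SOR and NPV into a single score; the two-goal statement in the abstract implies a weighted or constrained formulation that the abstract does not specify.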
Abstract This paper is a contribution to failure prediction of unconsolidated intervals that could negatively affect injection efficiency because of their susceptibility to structural changes under fluid injection. In unconsolidated formations, formation fines may be subjected to drag forces by injected water because of poor cementation. This results in small grain movements, whose continuation can produce a gradual increase in permeability and the eventual development of washed-out or thief zones. This paper presents a new modeling approach that uses information from profile surveys and grain and pore size distributions to model the injection process and the induced particle movement. The motivation came from field observations and the realization, from profile surveys, of substantial fines movement leading to an increase in rock permeability. A series of case studies based on realistic published data on pore and grain size distributions is included to demonstrate the estimated increases in formation permeability. In our modeling approach, once we establish the range of grain sizes that fits the criterion for particle movement, a probabilistic algorithm developed for this study is applied to track changes in porosity and the associated variations in permeability. This algorithm, presented here for the first time, uses a stochastic approach to monitor reservoir particle movements, pore size exclusion by particle accumulation, and the resultant changes in rock properties. For this methodology, we ignored the potential effects of wettability and clay swelling and represented the various grain sizes as perfect spheres. Predictions made using various realizations of channel formation and petrophysical alteration show the significance of having access to three sources of information: pore size distribution, grain size distribution, and profile surveys.
Through inverse modeling using these pieces of information for a particular formation, we demonstrate how we can predict realistic changes and map rock transport properties.
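The paper's probabilistic algorithm is not given in the abstract; as a rough illustration of the kind of bookkeeping it describes, the sketch below mobilizes fine grains below a size threshold with some probability, converts the freed grain volume into extra pore space, and rescales permeability with a Kozeny-Carman relation. All parameter values, the threshold, and the Kozeny-Carman choice are assumptions for illustration:

```python
import random

def kozeny_carman(phi, k0, phi0):
    """Permeability rescaled from a porosity change via Kozeny-Carman."""
    return k0 * (phi / phi0) ** 3 * ((1 - phi0) / (1 - phi)) ** 2

def simulate_fines_migration(grain_diams_um, mobil_threshold_um=50.0,
                             phi0=0.25, k0_md=100.0, move_prob=0.3,
                             trials=1000, seed=7):
    """Stochastic sketch: each fine grain below the mobilization threshold
    moves with probability `move_prob`; the freed grain volume becomes extra
    pore space (a washed-out channel), raising porosity and permeability.
    Returns the mean permeability (md) over the Monte Carlo trials."""
    rng = random.Random(seed)
    total_vol = sum(d ** 3 for d in grain_diams_um)   # spheres: volume ~ d**3
    perms = []
    for _ in range(trials):
        moved_vol = sum(d ** 3 for d in grain_diams_um
                        if d < mobil_threshold_um and rng.random() < move_prob)
        phi = phi0 + (1 - phi0) * moved_vol / total_vol
        perms.append(kozeny_carman(phi, k0_md, phi0))
    return sum(perms) / len(perms)
```

A real implementation would also track where the mobilized fines re-deposit (pore size exclusion), reducing permeability locally; this sketch covers only the source side of that balance.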
Abstract This paper presents a large-scale experimental study of the compositional effect on produced bitumen properties in SAGD. The SAGD experiment used a sandpack in a cylindrical pressure vessel that was 1.22 m in length and 0.425 m in internal diameter. The pore volume of the sandpack was 58 L, and the porosity and permeability were 0.33 and 5.5 D, respectively. The sandpack was initially saturated with 93% bitumen and 7% deionized water. After preheating, the SAGD experiment was operated mostly at a steam injection rate of 35 cm³/min (cold-water equivalent) at 3600 kPa (244°C). The produced fluids (gas, oil, and water) were analyzed; e.g., ten oil samples were analyzed in terms of carbon number distribution (CND), asphaltene content, density, and viscosity to investigate the compositional change of the produced bitumen. After the experiment, the sandpack was excavated and samples were taken for analysis of solid, water, oil, asphaltene, and sulfur contents. Experimental data (e.g., propagation of the steam chamber and production of oil and water) were history-matched using a numerical reservoir simulator. Results showed that the produced bitumen was lighter and contained 1 to 5 wt% less asphaltenes than the original bitumen, while the remaining oil inside the steam chamber contained 6 wt% more asphaltenes. As a result, the produced bitumen was 1 to 6 kg/m³ less dense than the original bitumen. In actual operations, bitumen is diluted with condensate to reduce the oil viscosity for pipeline shipping; this reduction in bitumen density corresponds to a reduction of the diluent cost by 5-10%. The produced bitumen became less dense with increasing steam-chamber volume. The history-matched simulation indicated that the progressively decreasing density of the produced bitumen can be attributed to the vaporization of relatively volatile components in the remaining oil and the condensation of those components near the chamber edge.
The history matching also indicated that varying flow regimes (counter-current and co-current flow of water and oil) affected the oil recovery during the SAGD experiment.
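The link between a small density drop and a 5-10% diluent saving can be illustrated with an ideal volumetric blending balance. The densities below are illustrative round numbers, not values from the paper:

```python
def diluent_fraction(rho_bitumen, rho_diluent, rho_target):
    """Volume fraction of diluent needed for the blend to meet a target
    density, assuming ideal volumetric mixing (no excess-volume effects)."""
    return (rho_bitumen - rho_target) / (rho_bitumen - rho_diluent)

# Illustrative densities in kg/m³: original vs. lighter produced bitumen,
# condensate diluent, and a pipeline target density
x_original = diluent_fraction(1010.0, 750.0, 940.0)
x_produced = diluent_fraction(1004.0, 750.0, 940.0)
savings = 1.0 - x_produced / x_original   # relative reduction in diluent demand
```

With these example numbers, a 6 kg/m³ lighter bitumen cuts the required diluent fraction by roughly 6%, consistent with the 5-10% cost reduction the abstract reports.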