Tanaka, Shusei (Chevron Energy Technology Company) | Wang, Zhenzhen (Chevron Energy Technology Company) | Dehghani, Kaveh (Chevron Energy Technology Company) | He, Jincong (Chevron Energy Technology Company) | Velusamy, Baskar (Chevron Energy Technology Company) | Wen, Xian-Huan (Chevron Energy Technology Company)
Field development optimization for oil and gas reservoirs is typically challenging due to the large number of control parameters, model complexity, and subsurface uncertainties. In this study, we propose a joint field development and well control optimization workflow that uses a robust parameterization technique, and we demonstrate its application to an offshore oil field development.
Traditionally, using simulation models to optimize a field development plan was considered time- and cost-prohibitive when the models had to cover the full range of uncertainties in reservoir properties. Consequently, the problem was simplified by reducing the number of control parameters through multi-disciplinary workflows. In this paper, we aim to optimize the field development strategy by simultaneously controlling the topside facility, the number of wells, their trajectories, the drilling sequence, and the completion strategy, while honoring subsurface uncertainties and constraints. To achieve this, we used our next-generation reservoir simulator and commercial cloud computing to explore the possibility of reaching an optimized development scenario within reasonable time and cost constraints.
We have applied the proposed workflow to the Olympus field case, an optimization benchmarking problem set up by the Netherlands Organisation for Applied Scientific Research (TNO) using a synthetic North Sea-type reservoir. Our objective is to improve the net present value (NPV) after 20 years of operation by controlling the number and location of platforms and the number of injectors and producers, as well as their trajectories and drilling sequence. The large number of control parameters and subsurface uncertainties makes the optimization process challenging. Three optimization techniques, genetic algorithm (GA), particle swarm optimization (PSO), and ensemble-based optimization (EnOpt), were tested and their performances compared. The best results in terms of NPV improvement were obtained with the mixed-integer GA. More than ten thousand simulation runs were required for the method to reach an optimal development in terms of well location, trajectory, drilling sequence, and related controls. This was made possible by a high-performance parallel simulator and cloud computing. The estimated cost of the commercial cloud service is almost negligible compared with the improvement in the economic value of the optimized asset development plan. The developed workflow and parameterization technique are flexible in well trajectory configuration and completion design, allowing application to primary depletion as well as waterflooding.
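The mixed-integer GA approach described above can be sketched as follows. This is a minimal, hedged illustration only: the objective function here is a hypothetical stand-in for the abstract's full reservoir-simulation NPV evaluation, and the two integer decision variables (well count, platform size) are simplified placeholders for the paper's much larger control set.

```python
import random

def toy_npv(x):
    # Hypothetical stand-in for a reservoir-simulation NPV evaluation:
    # more wells add revenue with diminishing returns; platforms add capex;
    # each platform can serve at most four wells (assumed constraint).
    n_wells, platform_size = x
    revenue = 80.0 * n_wells - 2.5 * n_wells ** 2
    capex = 30.0 + 12.0 * platform_size
    return revenue - capex if n_wells <= 4 * platform_size else -1e9

def mixed_integer_ga(objective, bounds, pop_size=40, generations=60, seed=0):
    """Simple elitist GA over integer decision variables."""
    rng = random.Random(seed)
    def rand_ind():
        return [rng.randint(lo, hi) for lo, hi in bounds]
    pop = [rand_ind() for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=objective, reverse=True)[: pop_size // 4]
        children = list(elite)  # elitism: keep the best quarter
        while len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [rng.choice(pair) for pair in zip(a, b)]  # uniform crossover
            if rng.random() < 0.3:  # integer mutation: step one gene by +/-1
                i = rng.randrange(len(child))
                lo, hi = bounds[i]
                child[i] = min(hi, max(lo, child[i] + rng.choice([-1, 1])))
            children.append(child)
        pop = children
    return max(pop, key=objective)

best = mixed_integer_ga(toy_npv, bounds=[(1, 20), (1, 5)])
```

In the actual study each objective evaluation is a full reservoir simulation, which is why thousands of runs and parallel cloud execution were needed.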
The capacitance-resistance model (CRM) is an analytical tool that uses only injection and production rates to quantify interwell connectivity and response time during a waterflood. It has been widely used in conventional waterfloods for reservoir characterization, performance evaluation, and optimization. Heavy oil waterfloods introduce challenges to the application of CRM due to the high mobility ratio and its rapid variation as a function of waterflood maturity. Using conceptual reservoir models and sensitivity studies, we provide guidelines for the application of CRM in heavy oil waterfloods and illustrate our approach in two heavy oil fields under waterflood. In heavy oil waterfloods, interwell connectivity and response time are observed to vary over time, especially right before and after water breakthrough. The magnitude of CRM parameter variation is a function of the viscosity ratio between water and oil, flood maturity, and the contrast in flow and storage capacity of the flow units. Case studies of heavy oil waterfloods (one mature, one immature) showed that CRM can be used for waterflood analysis and forecasting. The most extreme variation of CRM parameters was observed in the immature flood, which had one injector, two producers, and high permeability contrast.
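The CRM forecast the abstract relies on can be sketched in its common producer-based (CRMP) discrete form: each producer's rate decays with a time constant τ and is supported by a weighted sum of injection rates through connectivity gains f_ij. This is a generic sketch of the standard model, not the authors' specific implementation, and the rates below are illustrative.

```python
import math

def crmp_forecast(q0, taus, gains, inj_rates, dt):
    """Discrete CRMP forecast.
    q0:        initial rate per producer
    taus:      time constant per producer
    gains:     gains[j][i] = connectivity f_ij from injector i to producer j
    inj_rates: inj_rates[n][i] = injector i's rate at step n
    dt:        time-step size
    Returns producer rates at every step."""
    rates = []
    q_prev = list(q0)
    for step in inj_rates:
        q_new = []
        for j in range(len(q0)):
            decay = math.exp(-dt / taus[j])
            support = sum(f * inj for f, inj in zip(gains[j], step))
            # rate relaxes toward the connectivity-weighted injection support
            q_new.append(q_prev[j] * decay + (1.0 - decay) * support)
        rates.append(q_new)
        q_prev = q_new
    return rates
```

With constant injection, the forecast relaxes toward the gain-weighted injection rate, which is why the gains and time constants can be estimated by fitting observed rates; the abstract's point is that in heavy oil these fitted parameters drift with flood maturity.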
Li, Boxiao (Chevron Energy Technology Company) | Bhark, Eric W. (Chevron Asia Pacific E&P Company) | Gross, Stephen J. (Chevron Energy Technology Company) | Billiter, Travis C. (Chevron Energy Technology Company) | Dehghani, Kaveh (Chevron Energy Technology Company)
Assisted history matching (AHM) using Design of Experiments (DoE) is one of the most commonly applied history matching techniques in the oil and gas industry. When applied properly, this stochastic method finds a representative ensemble of history-matched reservoir models for probabilistic uncertainty analysis of the production forecast. Although DoE-based AHM is straightforward in concept, it can be misused in practice because the workflow involves many statistical and modeling principles that should be followed rigorously. In this paper, the entire DoE-based AHM workflow is demonstrated in a coherent and comprehensive case study divided into seven key stages: problem framing, sensitivity analysis, proxy building, Monte Carlo simulation, history-match filtering, production forecast, and representative model selection. Best practices for each stage are summarized to help reservoir management (RM) engineers understand and apply this powerful workflow for reliable history matching and probabilistic production forecasting. One major difficulty in any history matching method is defining the history-match tolerance, which reflects the engineer's comfort level in calling a reservoir model "history-matched" even though the difference between simulated and observed production data is not zero. It is a compromise with the intrinsic and unavoidable imperfection of reservoir model construction, data measurement, and proxy creation. A practical procedure is provided to help engineers define the history-match tolerance considering the model, data-measurement, and proxy errors.
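The history-match filtering step can be sketched as a tolerance check on a normalized misfit. The details below are assumptions for illustration, not the paper's procedure: the measurement and proxy errors are combined in quadrature (assuming independence), and a model passes when its root-mean-square mismatch is within that combined error scaled by a tolerance factor.

```python
import math

def history_match_misfit(sim, obs, sigma_meas, sigma_proxy):
    """Normalized misfit: RMS(sim - obs) divided by the combined error.
    Measurement and proxy errors are combined in quadrature (assumed
    independent); a misfit near 1 means the mismatch is comparable to
    the unavoidable error budget."""
    sigma = math.sqrt(sigma_meas ** 2 + sigma_proxy ** 2)
    rms = math.sqrt(sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs))
    return rms / sigma

def is_history_matched(sim, obs, sigma_meas, sigma_proxy, tolerance=1.0):
    """Filter rule: keep the model if its normalized misfit is within tolerance."""
    return history_match_misfit(sim, obs, sigma_meas, sigma_proxy) <= tolerance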
Kim, Do Hoon (Chevron Energy Technology Company) | Alexis, Dennis (Chevron Energy Technology Company) | New, Peter (Chevron Upstream Europe) | Jackson, Adam C (Chevron Upstream Europe) | Espinosa, David (Chevron Energy Technology Company) | Isbell, Taylor Jordan (Chevron Energy Technology Company) | Poulsen, Anette (Chevron Upstream Europe, Chevron Energy Technology Company) | McKilligan, Derek (Chevron Upstream Europe) | Salman, Mohamad (Chevron Energy Technology Company, University of Houston) | Malik, Taimur (Chevron Energy Technology Company) | Thach, Sophany (Chevron Energy Technology Company) | Dwarakanath, Varadarajan (Chevron Energy Technology Company)
Polymer mixing is often challenging under offshore conditions due to space constraints. A theoretical approach is required to better understand the drivers for polymer hydration and to design optimal field mixing systems. We share a novel theoretical approach to gain insights into the energy required for optimum mixing of novel liquid polymers. We present a new parameter, "Specific Mixing Energy," that is measured under both lab and field mixing conditions and can be used to scale up laboratory mixing. We developed a simplified laboratory mixing process for a novel liquid polymer that provided acceptable viscosity yield, filtration ratio (FR), and non-plugging behavior during injectivity tests in a surrogate core. An FR of less than 1.5 using a 1.2 μm filter at 1 bar was considered acceptable for inverted polymer quality. We developed estimates of the specific mixing energy required for lab polymer inversion to achieve these stringent FR standards and comparable viscosity yield. We then conducted yard trials with both single-stage and dual-stage mixing of the novel liquid polymer and developed correlations for specific mixing energy under dynamic conditions. Based on the results of the lab and yard trials, we tested the approach in a field injectivity test. The FR and viscosity were also correlated with specific mixing energy to establish the desired operating window from laboratory to field-scale applications. Such information can be used to enhance EOR applications using liquid polymers in offshore environments.
Asset steady-state modeling is frequently done by separate teams of engineers working on asset subsystems, production- and processing-facility design, and asset operation management. Design criteria shared between the teams are based on a limited set of predefined discrete assumptions for each subsystem. This commonly used modeling approach fails to account for subsystem interdependencies and does not enable assessment of changing conditions across the asset life cycle. The gap often results in suboptimal facilities design for the asset and in cost overruns and/or lost production. Overall asset-management decision-making tools should reinforce the consistency of information used across the whole asset. Integrated modeling helps to make informed decisions by considering the combined effect of reservoir uncertainties, well placement, the surface network, and the process facility on overall asset design, production, and management. This paper describes how a fully integrated asset model is built and used as an effective decision-support tool to help optimize overall asset design and operational performance management.
A fully integrated model is built using typical industry-known commercial simulator packages for each subsystem and field conditions. It includes steady-state models for reservoirs, wells, production and injection networks, and processing facilities. A platform that orchestrates data connectivity and integration, while allowing subsystem applications to retain their own thermodynamic-property and equation-of-state calculations, is used for transferring data and controlling variables within the subsystems. Data from a major capital project in the early stages of development were used to pilot test the technique and to stress test the consistency of forecast production across the subsystems, assessing whole-asset impact due to a change in an individual subsystem such as reservoir uncertainties, well placement and scheduling, surface-network operating conditions, and individual subsystem constraints.
The integrated multifield network model provided realistic optimal operating conditions and long-term forecasts of oil production, sales gas, and water injection requirements by incorporating physics at appropriate levels to ensure higher accuracy than discrete models.
Integrated asset modeling uses information consistently across the entire asset and captures the effect of subsystem interdependencies on overall asset steady-state operating conditions. Integrated modeling provides the opportunity for independent teams of facilities engineers, production engineers, and reservoir engineers to collaborate as a unified team while still serving each discipline's specific needs and resources.
The asset integrated model technique offers a novel and versatile capability of evaluating diverse operational scenarios to find optimal settings for short- and long-term asset production needs.
Acid fracture conductivity experiments provide information valuable to effective design of acid fracture stimulation treatments in carbonates. They are complex tests, and a number of procedural details must be taken into consideration in order to upscale results to an adequate representation of well-scale acid fracture behavior for well productivity predictions. This paper focuses on a study performed on analog, quarried limestone core samples and a small number of reservoir core samples from the Unit 2 formation of Kazakhstan's Tengiz field to understand the impact of different acid fluid systems and procedural steps on acid fracture conductivity.
The physical structure of the etched channels is observed to be affected by the nature of the fluid and has a strong impact on conductivity. The quality and uniformity of the core samples have a critical impact on the measurements and are assessed. The residence time of the acid in the acid-etched experimental channel is small, so the equivalency of acid volumes injected at the experimental scale to the large acid volumes injected into a well-scale fracture channel is considered. Special consideration is given to the procedure of applying stress to, and measuring the conductivity of, the experimental acid fracture channel, which may result in large-scale mechanical failure of the core sample, preventing a high-quality measurement of conductivity following injection of a closed-fracture acidizing stage.
This body of work discusses the design options and challenges which play a role in defining the testing strategy for an acid fracture conductivity study. Results demonstrate that with selection of appropriate fluid systems, acid fracture conductivity can be retained up to a closure stress of 6,000-7,000 psi in the Tengiz Unit 2 reservoir. A modified stress ramp-up procedure to improve closed fracture acidizing conductivity testing results obtained through the testing program is presented.
Hahn, Ruth (Chevron Energy Technology Company) | Spilker, Kerry (Chevron Energy Technology Company) | Alexis, Dennis (Chevron Energy Technology Company) | Linnemeyer, Harry (Chevron Energy Technology Company) | Malik, Taimur (Chevron Energy Technology Company) | Dwarakanath, Varadarajan (Chevron Energy Technology Company)
Current chemical EOR technologies, such as surfactant-polymer flooding, have limitations in high-temperature, high-salinity, and/or low-permeability reservoirs. Low-tension foam flooding, however, can be performed at high or low temperatures, salinities, and permeabilities. Low-tension foam reduces residual oil saturation and increases sweep efficiency. The current literature shows the effect of foam in corefloods and in limited 2-D micromodel experiments. We investigate the ability of low-tension foams to successfully divert flow into lower-permeability zones in both communicating and non-communicating heterogeneous systems.
During this laboratory study, a surfactant that has shown success for foam applications in the field was evaluated and used as a baseline to compare and evaluate low tension foam formulations. We measured resistance factors under constant pressure and constant flow conditions. In addition, we established benchmarks for flow diversion. The low-tension formulations were modified to display foaming abilities and compared against the baseline surfactant benchmarks. Phase behavior experiments were conducted to determine the oil solubilization ratio of each surfactant formulation. Foam floods with surfactant formulations with favorable solubilization ratios were conducted in heterogeneous systems to determine sweep efficiency and oil recovery.
Surfactant formulations were developed that showed favorable apparent viscosities during
Wehunt, C. Dean (Chevron North America Exploration and Production Company) | Lattimer, Stefan K. K. (Chevron Europe, Eurasia, and Middle East Exploration and Production Company) | McDuff, Darren R. (Chevron Energy Technology Company)
In this paper, we provide an update on recent advances in, and summarize global experiences with, dendritic-acidizing (DA) methods, or acid tunneling. We include both coiled-tubing (CT) deployed and non-CT methods, and we discuss process limitations, candidate-selection criteria, job-design factors, operational learnings, risks, and surveillance requirements and opportunities. A comprehensive review of published information is provided for three different tunneling methods, along with relevant information for several others. The literature information is supplemented by depth, temperature, and pressure records for the three processes, which are discussed in detail. Execution factors such as the logistics required, the time required, and the volumes of acid and other fluids used are also compared for the three methods.
Previous papers have focused on only one of the methods, whereas we discuss acid-job optimization, process risks, and surveillance requirements for multiple acid-tunneling methods in substantially greater depth than past authors have. The three methods detailed in this paper are all viable but may have different niches. Differences in the job counts for the different methods are easily explained by differences in process vintages, execution speeds, and depth limitations. Previous optimization efforts focused on tunnel creation but not on acid-job effectiveness in terms of the wormholes generated adjacent to the tunnels; however, some progress is now being made in that regard. There are differences in the processes regarding pushing or pulling the jetting nozzles into the tunnels, and differences in the resulting tunnel trajectories. Prejob caliper data are more critical for one of the processes than for the others, and there are significant differences in the ability to measure or control tunnel direction. The tunneling tools have different sizes, but when tool-size alternatives are available, the larger tool sizes offer no clear advantages to the operator. Useful risk-mitigation measures are also discussed, and a comprehensive bibliography is included to facilitate further examination of the technology alternatives by other petroleum-industry professionals.
ABSTRACT: Analysis of formation pressures, stresses, and failure conditions is critical in the planning and development of any shale or “unconventional” play. Commonly encountered wellbore-stability problems and the optimization of hydraulic fracturing treatments depend on the accuracy of these estimates. This paper introduces a case study located in an active tectonic setting, the Neuquén Basin of western Argentina, with special focus on the Vaca Muerta shale interval. Geological controls on spatial property distributions are discussed. The paper describes the workflow used for building 1D mechanical earth models, showing the approaches used for estimating overpressure and its causes, for assessing material anisotropy, and for deriving the poroelastic parameters needed for stress estimation. Results of the 1D models are compared with results from a 3D finite-element model built using the “nested,” or sub-modeling, approach. The model was calibrated using failure analysis on breakouts and instantaneous shut-in pressure values from pre-frac tests.
Understanding the spatial distribution of material properties, pore pressure, and in-situ stresses is critical for the efficient operation of unconventional fields. This information provides the main input for applications such as wellbore-stability analysis and the design of hydraulic fracturing treatments, which are essential for planning and executing production strategies in these drilling- and completion-intensive operations.
Unconventional plays all over the world share common challenges, such as the difficulty of taking direct measurements of formation pressure or in-situ stress. The typically layered nature of both reservoir and overburden rocks makes the usual assumption of material isotropy inapplicable in most cases, which further complicates rock characterization.
In this paper, we show a complete workflow for building a consistent set of pore pressure, material, and stress models both in 1D (a well-based or log-based approach) and in 3D (a finite-element approach) for unconventional fields. For each step, we present the main assumptions, the preferred computation algorithm, and the use of calibration data to validate the model outputs.
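The abstract does not state which log-based overpressure algorithm the 1D models use; as one widely used example of the kind of computation involved, Eaton's sonic method estimates pore pressure from the deviation of observed compressional slowness from a normal-compaction trend. The sketch below is that generic method under assumed inputs, not the authors' workflow, and the Eaton exponent of 3.0 is just the conventional default.

```python
def eaton_pore_pressure(obg, p_hydro, dt_obs, dt_normal, exponent=3.0):
    """Eaton's sonic method (all pressures as gradients, e.g. psi/ft).
    obg:       overburden gradient
    p_hydro:   normal (hydrostatic) pore-pressure gradient
    dt_obs:    observed sonic slowness at depth
    dt_normal: slowness expected from the normal-compaction trend
    Slower-than-normal rock (dt_obs > dt_normal) indicates undercompaction,
    pushing the estimate above hydrostatic toward the overburden."""
    return obg - (obg - p_hydro) * (dt_normal / dt_obs) ** exponent
```

On the normal-compaction trend the estimate reduces to hydrostatic pressure; calibration data such as the breakout analysis and pre-frac shut-in pressures mentioned above would then constrain the trend and exponent.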
Zhang, Yanbin (Chevron Energy Technology Company) | He, Jincong (Chevron Energy Technology Company) | Yang, Changdong (Chevron Energy Technology Company) | Xie, Jiang (Chevron Energy Technology Company) | Fitzmorris, Robert (Chevron Energy Technology Company) | Wen, Xian-Huan (Chevron Energy Technology Company)
We developed a physics-based data-driven model for history matching, prediction, and characterization of unconventional reservoirs. It uses 1D numerical simulation to approximate 3D problems. The 1D simulation is formulated in a dimensionless space by introducing a new diffusive diagnostic function (DDF). For radial and linear flow, the DDF is shown analytically to be a straight line with a positive or zero slope. Without any assumption of flow regime, the DDF can be obtained in a data-driven manner by means of history matching using the ensemble smoother with multiple data assimilation (ES-MDA). The history-matched ensemble of DDFs offers diagnostic characteristics and probabilistic predictions for unconventional reservoirs.
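The ES-MDA step underlying this history matching can be sketched in its simplest scalar form (one parameter, one datum); the generic update below follows the standard ES-MDA recipe and is not the authors' implementation, which assimilates the DDF ensemble.

```python
import math
import random

def _cov(x, y):
    # sample covariance of two equal-length lists
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)

def es_mda_step(m_ens, d_ens, d_obs, sigma_obs, alpha, rng):
    """One ES-MDA assimilation step for a scalar parameter and scalar datum.
    m_ens/d_ens: ensembles of parameters and simulated data
    alpha:       noise-inflation factor; over Na steps, sum(1/alpha_i) = 1
    Each member moves toward a perturbed observation via a Kalman-like gain."""
    gain = _cov(m_ens, d_ens) / (_cov(d_ens, d_ens) + alpha * sigma_obs ** 2)
    return [m + gain * (d_obs + math.sqrt(alpha) * sigma_obs * rng.gauss(0.0, 1.0) - d)
            for m, d in zip(m_ens, d_ens)]
```

Repeating the step (re-running the forward model each time) over the scheduled alphas drives the ensemble toward models consistent with the data while preserving spread, which is what gives the history-matched DDF ensemble its probabilistic character.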