We interpreted a series of single-well chemical tracer tests (SWCTTs) estimating the residual oil saturation to waterflood (SORW) after the base high-salinity waterflood, a low-salinity waterflood, and a subsequent polymer flood conducted in a Greater Burgan well. Interpreting the tests requires history matching the back-production of partitioning and non-partitioning tracers, which is affected by differing amounts of irreversible flow and dispersion as well as by the amount of residual oil.
We applied a state-of-the-art chemical reservoir simulator (UTCHEM) and an assisted history matching tool (BP's Top-Down Reservoir Modeling) to interpret the tests and accurately quantify the uncertainty in residual oil saturation after the high-salinity, low-salinity, and polymer floods. Two optimization approaches, a genetic algorithm (GA) and a hybrid particle swarm optimization (PSO)/mesh adaptive direct search (MADS) algorithm, were applied to better characterize that uncertainty.
Our results show a six-saturation-unit decrease in SORW after low-salinity flooding, with no change in SORW after polymer flooding. This is in line with our expectations: we expected no change in SORW after polymer because a conventional HPAM, which does not exhibit viscoelastic behavior, was used in the test. We demonstrate that history matching the back-produced tracer profiles is a robust approach to estimating SORW by showing that assuming a three- or four-layer simulation model does not change the estimated SORW. We also accounted for the uncertainty in the partition coefficient in our uncertainty estimates.
We present several innovations that improve the history matching of back-produced tracer profiles and hence the SORW estimates; for example, assigning a different dispersivity to each simulation layer to account for differing heterogeneity, rather than assuming a single dispersivity for all layers. We generate more robust uncertainty estimates by finding a range of alternative history matches, all of which are consistent with the measured data.
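In SWCTT interpretation, the partitioning tracer travels slower than the non-partitioning tracer by a retardation factor that depends on the residual oil saturation and the partition coefficient; the standard relation R = 1 + K·Sor/(1 − Sor) can be inverted for Sor. A minimal sketch of that inversion (the numbers are illustrative, not from the paper):

```python
def sorw_from_retardation(retardation, partition_coeff):
    """Invert the standard SWCTT retardation relation
    R = 1 + K * Sor / (1 - Sor) for the residual oil saturation Sor."""
    beta = retardation - 1.0          # beta = K * Sor / (1 - Sor)
    return beta / (partition_coeff + beta)

# Illustrative numbers (not from the paper): K = 4, retardation R = 1.5
print(round(sorw_from_retardation(1.5, 4.0), 3))   # -> 0.111
```

Uncertainty in the partition coefficient K propagates directly into Sor through this relation, which is why the paper carries it through the uncertainty estimates.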
The objective of this paper is to explore the benefits of using the Interactive Epoch-Era Analysis (IEEA) methodology for evaluating architectural changes in a trade space exploration study. A subsea tieback offshore Brazil is used as the reference case to investigate this premise from a full-field development perspective.
An automated concept exploration tool is employed. It applies meta-heuristics to generate different offshore facilities concepts with varying building blocks. The interaction between reservoir behavior and facilities design is accounted for, meaning pressure and temperature losses throughout the system are computed differently for each concept. These concepts are ranked by economic performance indicators (NPV, IRR, etc.), and each run with a given set of boundary conditions constitutes what is called an Epoch. This process is iterated over the whole life of field with different sets of boundary conditions, such as commercial aspects ($/bbl, $/MMBtu, market demand) and/or technological maturity aspects (TRL, novel technological concepts), generating what is called an Era. The whole data set is then evaluated in an interactive platform through the Humans-In-the-Loop (HIL) process.
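The epoch-era loop described above can be sketched in a few lines: each epoch fixes a set of boundary conditions, concepts are ranked by NPV within that epoch, and the sequence of epochs forms an era. The concepts, cash flows, and discount rates below are entirely hypothetical stand-ins for the tool's boundary conditions:

```python
def npv(cashflows, rate):
    """Discounted net present value of yearly cash flows (year 0 = capex)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical concepts: up-front capex followed by yearly revenue
concepts = {
    "tieback_A": [-900.0, 300.0, 280.0, 260.0, 240.0],
    "tieback_B": [-600.0, 220.0, 210.0, 200.0, 190.0],
}

# Each epoch fixes boundary conditions; here only the discount rate
# varies, as a crude stand-in for price/market scenarios.
era = [{"name": "low_price", "rate": 0.12},
       {"name": "high_price", "rate": 0.08}]

for epoch in era:
    ranked = sorted(concepts,
                    key=lambda c: npv(concepts[c], epoch["rate"]),
                    reverse=True)
    print(epoch["name"], ranked)
```

The real tool additionally re-simulates reservoir/facilities coupling per concept; this sketch only shows the ranking-per-epoch structure.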
Model Based Systems Engineering (MBSE) is being employed successfully in other engineering fields outside the O&G context such as the aerospace and automotive industries. While digital tools have been identified as a potential key contributor to the future of O&G performance enhancement and further cost reductions, that is yet to be shown. This work intends to provide backing for that argument in one of the potential applications during early concept exploration phases by showing that quick high value assessments following an MBSE approach may be carried out, once significant effort has been put into proper development, verification and validation (V&V) of such digital tools.
While integrated models for asset development have long been a subject of interest for O&G operators, the application of Systems Engineering concepts to them has not yet been thoroughly explored. Systems Engineering provides a rigorous and proven method of dealing with complex systems that is highly applicable to offshore field developments. MBSE is the current state of the art for capital-intensive projects such as space exploration spacecraft and rovers. Learning from these successful use cases and applying these methodologies in the development of digital technologies may provide a new set of tools for O&G operators' facilities engineers and the like. The case study presented shows MBSE's capability of capturing intrinsic non-linearities and specificities of each O&G field/location while ensuring project-wide functional requirements are successfully met.
The traditional trial-and-error approach to history matching requires engineers to adjust each uncertain parameter manually and can be quite time-consuming and inefficient. In contrast, automatic history matching (AHM) is an efficient, computer-assisted process that controls a large number of parameters simultaneously via an algorithm that integrates a static model with dynamic data to minimize a misfit and improve reliability. It helps to reduce simulation run time as well.
Particle Swarm Optimization (PSO) is a population-based stochastic algorithm that can explore the parameter space, combined here with a least-squares single objective function. The AHM process can adopt parameterization and realization methods to simplify the inverse problem. In this study, realizations of various reservoir properties, such as porosity, net-to-gross, relative permeability, horizontal and vertical permeability, and aquifer size, were chosen as controls throughout the AHM. History matching was conducted to validate the efficiency of each method. Guidelines for optimizing AHM with a stochastic algorithm are also discussed.
The realization and parameterization methods improved matching results in a full-field application, yielding a reduced misfit in less time. A stochastic algorithm generates multiple models to deduce the control parameters that reduce the misfit. In this study we found that PSO converged effectively with updated control parameters. The optimized AHM improved the accuracy of the full-field model, although some misfit remained in the match to bottomhole pressure.
We found that updating too many parameters makes the problem difficult to solve, while using too few leads to false convergence. In addition, because simulation run time is critical, a full-field simulation model with reduced computational overhead is beneficial.
In this study, we observed that PSO was an efficient algorithm for updating control parameters to reduce the misfit. Using parameterization and realization as assisting methods helped produce better results. Overall, this study can serve as a guideline for optimizing the history matching process.
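The core PSO loop used in such AHM workflows is compact: each particle tracks its personal best, the swarm shares a global best, and velocities mix inertia with cognitive and social pulls. The sketch below is a generic, stdlib-only PSO minimizing a least-squares misfit on a toy two-parameter decline model; it stands in for, and is not, the study's simulator-coupled workflow:

```python
import random

def pso_minimize(misfit, bounds, n_particles=30, n_iter=200, seed=1):
    """Minimal particle swarm: inertia + cognitive + social velocity update,
    positions clamped to bounds, personal/global bests tracked."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [misfit(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    w, c1, c2 = 0.7, 1.5, 1.5          # common default coefficients
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            f = misfit(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Toy "history match": recover two parameters of an exponential decline
# model from synthetic observed rates (stand-ins for reservoir properties).
E = 2.718281828459045
observed = [(t, 100.0 * E ** (-0.3 * t)) for t in range(10)]

def misfit(x):
    qi, decline = x
    return sum((qi * E ** (-decline * t) - q) ** 2 for t, q in observed)

best, best_f = pso_minimize(misfit, [(10.0, 200.0), (0.01, 1.0)])
print(best, best_f)
```

In a real AHM run, `misfit` would launch a reservoir simulation and compare simulated to historical production data; the optimizer itself is unchanged.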
Time-lapse seismic monitoring is a powerful technique for reservoir management and the optimization of hydrocarbon recovery. In time-lapse seismic datasets, the difference in seismic properties across vintages enables the detection of spatio-temporal changes in saturation properties and structure induced by production. The main objectives are (1) to identify bypassed pay zones in time-lapse seismic data for the deepwater Amberjack field, located in the Gulf of Mexico, (2) to confirm the identified bypassed pay zones against reservoir simulation results, and (3) to recommend well planning strategies to exploit these bypassed resources.
A high-fidelity seismic-to-simulation 4D workflow that incorporates seismic, petrophysics, petrophysical property modeling, and reservoir simulation was employed, leveraging cross-discipline interaction, interpretation, and integration to extend asset management capabilities. The workflow addresses geology (well log interpretation and framework development), geophysics (seismic interpretation, velocity modeling, and seismic inversion), and petrophysical property modeling (earth models and co-located co-simulation of petrophysical properties with P-impedance from seismic inversion). An embedded petro-elastic model (PEM) in the reservoir simulator is then used to combine spatial dry rock properties with saturation properties to compute dynamic elastic properties, which can be related to multi-vintage P-impedance from time-lapse seismic inversion. In the absence of the requisite dry rock properties for the PEM, a small data engine is used to determine these absent properties using metaheuristic optimization techniques. Specifically, two particle swarm optimization (PSO) applications, including an exterior penalty function (EPF), are modified, resulting in the development of nested and average methods, respectively. These methods simultaneously calculate the missing rock parameters (dry rock bulk modulus, shear modulus, and density) necessary for dynamic, embedded P-impedance calculation in the history-constrained reservoir simulation results. Afterward, a graphics-enabled method was devised to select the threshold that discriminates non-reservoir (including bypassed pay) from reservoir in the P-impedance difference. Its results are compared to unsupervised learning (k-means clustering and hierarchical clustering). From seismic data, one can identify bypassed pay locations, which are confirmed from reservoir simulation after conducting a seismic-driven history match. Finally, infill wells are planned and then modeled in the reservoir simulator.
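The unsupervised alternative mentioned above, k-means on the P-impedance difference, is straightforward in one dimension: split the difference values into two clusters and place the threshold between the cluster centroids. A stdlib-only sketch with entirely hypothetical impedance-difference values (arbitrary units, not the Amberjack data):

```python
def kmeans_1d(values, k=2, n_iter=50):
    """Plain 1-D k-means: assign each value to the nearest centroid,
    then recompute centroids as cluster means; repeat."""
    centroids = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(n_iter):
        clusters = [[] for _ in range(k)]
        for v in values:
            j = min(range(k), key=lambda c: abs(v - centroids[c]))
            clusters[j].append(v)
        centroids = [sum(c) / len(c) if c else centroids[j]
                     for j, c in enumerate(clusters)]
    return sorted(centroids)

# Hypothetical P-impedance differences: a softening (negative) population
# where fluids changed, and a near-zero population elsewhere.
dI = [-0.9, -1.1, -1.0, -0.05, 0.02, 0.1, -0.95, 0.0, -1.2, 0.07]
lo, hi = kmeans_1d(dI)
threshold = 0.5 * (lo + hi)   # midpoint separates the two clusters
print(round(threshold, 2))
```

The paper's graphics-enabled thresholding and hierarchical clustering serve the same end; this sketch only illustrates the simplest of the compared classifiers.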
Junwen, Wu (Sinopec Research Institute of Petroleum Exploration and Development) | Wenfeng, Jia (Sinopec Research Institute of Petroleum Engineering) | Rusheng, Zhang (Sinopec Research Institute of Petroleum Exploration and Development) | Xueqi, Cen (Sinopec Research Institute of Petroleum Exploration and Development) | Haibo, Wang (Sinopec Research Institute of Petroleum Exploration and Development) | Jun, Niu (Sinopec Research Institute of Petroleum Exploration and Development)
A highly efficient foam unloading agent was developed to solve the unloading problem of liquid-loaded gas wells with high temperature, high salinity, and high concentrations of H2S and gas condensate. A Gemini anionic surfactant with a special comb structure was synthesized as the foaming agent, modified nanoparticles of controlled size and hydrophobicity were adopted as the solid foam stabilizer, and a fluorocarbon surfactant was designed and synthesized as the gas-condensate-resistance component. Laboratory results show that the foam unloading agent exhibits good foaming and foam-stabilizing ability at temperatures as high as 150°C, salinities up to 250,000 ppm, and H2S concentrations up to 2,000 ppm. In addition, the agent shows good liquid-carrying ability when the volume fraction of gas condensate is as high as 50%. In a field test of this foam unloading agent in well Longfengshan north 201-XY, average gas production increased from 7,256 m3/day to 11,329 m3/day, a 56% increase, and the average differential pressure between tubing and casing dropped from 2.66 MPa to 2.38 MPa, a 10.5% decrease. The improvements in both liquid unloading and gas production demonstrate that the foam unloading agent can meet the demands of drainage gas recovery in gas fields with high gas-condensate content.
Agent-based models (ABMs) provide a fast alternative to traditional partial differential equation (PDE)-based oil reservoir models by applying localized, inexpensive simulations rather than solving a partial differential equation at every time step. However, while theoretical and numerical results have been obtained with ABMs in social science applications, the accuracy of ABMs has not been analyzed in the context of oil reservoir modeling.
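A classic toy example of the ABM idea, unrelated to any specific reservoir ABM, is that an ensemble of random-walking agents reproduces the 1-D diffusion PDE: the variance of agent positions grows like 2·D·t, with D = step²/(2Δt), so after n unit-time ±1 steps the variance should be close to n. Localized per-agent rules thus recover the PDE's bulk behavior without solving it:

```python
import random

def walk_variance(n_agents=20000, n_steps=100, step=1.0, seed=0):
    """Each agent takes independent +/-step moves; the ensemble variance
    approximates the diffusion PDE's 2*D*t = n_steps * step**2."""
    rng = random.Random(seed)
    final = []
    for _ in range(n_agents):
        x = 0.0
        for _ in range(n_steps):
            x += step if rng.random() < 0.5 else -step
        final.append(x)
    mean = sum(final) / len(final)
    return sum((x - mean) ** 2 for x in final) / len(final)

var = walk_variance()
print(var)   # statistically close to n_steps * step**2 = 100
```

Quantifying how far such agent rules can drift from the PDE solution in a reservoir setting is precisely the accuracy question the paper raises.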
Some of the things that Sandy is designed to answer quickly include details around faults or the average porosity of a reservoir, which Laigle said is a “simple question to ask, but quite complex to answer.” In January, BP invested $5 million in the young company to help its upstream unit achieve a 90% reduction in the time its engineers spend on data collection, interpretation, and simulation. Behind Sandy are a number of programming elements that have been proven for years in the consumer sector. This includes the knowledge-graph technique that is core to Google’s ability to link relevant but unconnected pieces of information together with its search tool. In addition, the startup is creating a suite of intelligent agents to carry out domain-specific tasks with the data.
Vazquez, Oscar (Heriot Watt University ) | Ross, Gill (Chrysaor) | Jordan, Myles Martin (Nalco Champion) | Baskoro, Dionysius Angga Adhi (Heriot-Watt University) | Mackay, Eric (Heriot-Watt University) | Johnston, Clare (Nalco Champion) | Strachan, Alistair (Nalco Champion)
Oilfield-scale deposition is one of the important flow-assurance challenges facing the oil industry. There are a number of methods to mitigate oilfield scale, such as reducing sulfates in the injected brine, reducing water flow, removing damage by using dissolvers or physically by milling or reperforating, and inhibition, which is particularly recommended if a severe risk of sulfate-scale deposition is present. Inhibition consists of injecting a chemical that prevents the deposition of scale, either by stopping nucleation or by retarding crystal growth. The inhibiting chemicals are either injected in a dedicated continuous line or bullheaded as a batch treatment into the formation, commonly known as a scale-squeeze treatment. In general, scale-squeeze treatments consist of the following stages: preflush to condition the formation or act as a buffer to displace tubing fluids; the main treatment, where the main pill of chemical is injected; overflush to displace the chemical deep into the reservoir; a shut-in stage to allow further chemical retention; and placing the well back in production. The well will be protected as long as the concentration of the chemical in the produced brine is greater than a certain threshold, commonly known as minimum inhibitor concentration (MIC). This value is usually between 1 and 20 ppm. The most important factor in a squeeze-treatment design is the squeeze lifetime, which is determined by the volume of water or days of production where the chemical-return concentration is greater than the MIC.
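The squeeze-lifetime definition above lends itself to a small worked example. Assuming (as an illustration only) a power-law decline of the return concentration with produced-water volume, c(V) = c0·V^(−n), the protected volume is where c(V) ≥ MIC, and dividing by the water rate gives protected days; all numbers below are hypothetical:

```python
def squeeze_lifetime(c0, n, mic, water_rate):
    """For an assumed power-law return curve c(V) = c0 * V**(-n), return
    the produced-water volume and days until c drops below the MIC."""
    v_mic = (c0 / mic) ** (1.0 / n)    # volume (arbitrary units) at the MIC
    return v_mic, v_mic / water_rate

# Hypothetical returns: c0 = 500 ppm at unit volume, decline exponent
# n = 1.2, MIC = 5 ppm, produced-water rate of 2 volume units per day
volume, days = squeeze_lifetime(500.0, 1.2, 5.0, 2.0)
print(round(volume, 1), round(days, 1))
```

Real return curves come from field sampling or adsorption/desorption modeling; the power-law form here is a common empirical stand-in, not the paper's model.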
The main purpose of this paper is to describe the automatic optimization of squeeze-treatment designs using an optimization algorithm, in particular particle-swarm optimization (PSO). The algorithm provides a number of optimal designs, which result in squeeze lifetimes close to the target. To determine the most efficient of the optimal designs identified by the algorithm, the following objectives were considered: operational-deployment costs, chemical cost, total-injected-water volume, and squeeze-treatment lifetime. Operational-deployment costs include the support vessel, pump, and tank hire. There might not be a single design optimizing all objectives, and thus the problem becomes a multiobjective optimization. Therefore, a number of Pareto-optimal solutions exist. These designs are not dominated by any other design and cannot be improved upon in one objective without worsening another. Calculating the Pareto front is essential to identifying the most efficient (i.e., the most cost-effective) design.
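Extracting the non-dominated designs is a simple pairwise check. The sketch below reduces the four objectives to two for readability (minimize total cost, maximize lifetime); the designs and numbers are hypothetical, not the paper's results:

```python
def pareto_front(designs):
    """Keep designs not dominated by any other: a design is dominated if
    some other design has lower-or-equal cost AND greater-or-equal
    lifetime, with at least one strict inequality."""
    front = []
    for name, cost, life in designs:
        dominated = any(
            (c <= cost and l >= life) and (c < cost or l > life)
            for _, c, l in designs
        )
        if not dominated:
            front.append(name)
    return front

# Hypothetical squeeze designs: (name, total cost, lifetime in days)
designs = [
    ("A", 100.0, 200.0),
    ("B", 120.0, 260.0),
    ("C", 150.0, 250.0),   # dominated by B: costs more, protects for less
    ("D", 90.0, 150.0),
]
print(pareto_front(designs))   # A, B, and D are non-dominated
```

With all four objectives the same dominance test applies component-wise; the Pareto front then lets the engineer trade lifetime against deployment and chemical costs explicitly.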
A major challenge in carbonate reservoirs is the highly fractured nature of the rock. The flow rate may be high or low depending on the targeted fracture clusters, and flow rates may also vary from one region of the reservoir to another. Smart wells furnished with a smart completion strategy present great prospects for producing such reservoirs intelligently, helping to deal with these heterogeneities. It is established that early water breakthrough occurs when multilateral wells are completed with constant choke settings; one way to mitigate this problem is to use smart completions that manage unexpected production through fractures, thereby increasing ultimate recovery. Early water breakthrough is expected because, if a lateral section intersects a cluster of fractures, those fractures may connect to the water zone and trigger breakthrough. This can be managed by preferentially regulating production from the various laterals.
The evident communication among the various laterals of the mother bore makes it difficult to optimize production from intervals of variable productivity. In theory, the optimization of a smart completion involves several constraints; nevertheless, the setting of the smart inflow control valve (ICV) is the single most important parameter and may be the differentiating factor between a highly productive well and a poorly producing one. This study focuses on the reservoir engineering task of finding the optimum choke settings that lead to maximum recovery.
Computational intelligence through particle swarm optimization (PSO) is utilized as the core algorithm to determine the optimal ICV configuration for a fishbone well in a naturally fractured carbonate reservoir. A commercial black-oil simulator was used to evaluate the objective function, i.e., the fitness of each choke configuration, within a MATLAB workflow that coupled the optimization algorithm with the numerical simulator. A single fishbone well with 15 laterals was studied to assess the effect of the fracture network on water breakthrough and its consequent impact on recovery.
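The coupling pattern here is generic: the optimizer proposes choke settings, a wrapper runs the simulator and returns a fitness, and the loop repeats. The sketch below (Python rather than the paper's MATLAB, a toy proxy rather than the black-oil simulator, 3 valves rather than 15, and a hypothetical fracture layout) shows that wrapper pattern with an exhaustive search over discrete ICV positions:

```python
import itertools

def run_simulator(choke_settings):
    """Stand-in for the black-oil simulator call: returns a recovery
    proxy for a choke configuration. Toy model only: opening a choke
    adds oil, but opening a fracture-connected lateral brings early
    water and penalizes recovery quadratically."""
    frac_connected = [0, 2]            # hypothetical fracture-hit laterals
    recovery = 0.0
    for i, c in enumerate(choke_settings):
        gain = 0.04 * c                # more open -> more oil
        if i in frac_connected:
            gain -= 0.03 * c * c       # ...but early water if fractured
        recovery += gain
    return recovery

# Exhaustive search over discrete ICV positions (0 = shut .. 1 = fully open)
positions = [0.0, 0.25, 0.5, 0.75, 1.0]
best = max(itertools.product(positions, repeat=3), key=run_simulator)
print(best, round(run_simulator(best), 3))
```

With 15 laterals the configuration space is far too large to enumerate, which is exactly why the study replaces the exhaustive loop with PSO while keeping the same simulator-as-objective wrapper.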
Three scenarios are developed to assess the impact of optimization: a base case employing only multilateral well technology without smart completion, a smart completion with no optimization, and finally the optimized smart completion. The results clearly demonstrate not only the need for optimization but also the role of intelligent completions for wells in the reservoir studied. Without smart wells, water breakthrough occurs earlier and fewer hydrocarbons are produced; as smart wells are incorporated the results improve, and with the fully optimized ICV scheme recovery increases by almost 80%, from 21% to 38%. Moreover, the time to water breakthrough and the cumulative water cut are also managed quite significantly.