Abstract The Greater Enfield Project (GEP) is a challenging offshore oil development, designed to produce from the Laverda Canyon, Cimatti and Norton over Laverda oil fields. Six water injection wells are required to provide pressure support and sweep oil to three production wells in the Laverda Canyon and Cimatti oil accumulations to improve oil recovery. The GEP injection wells are a critical aspect of water flood design in a complex field, new to Woodside and with limited global benchmarks. A specific drilling and completion fluid system (reservoir drilling fluid, completion fluid and chemical filter cake breaker), combined with a unique clean-up and displacement technique, has been adopted to provide high and sustainable matrix injection performance. While filter cake breakers have previously been used in the industry, they are typically combined with a flowback for filter cake removal. Filter cake clean-up by means of flowback was discounted for GEP due to cost and the inability of some wells to flow naturally during early life. All GEP injection wells were completed in 2018-2019, one of which is globally the longest horizontal water injection well completed to date based on the Rushmore database. Fieldwide injection commenced in July 2019 with favourable results. This paper summarises the key design aspects adopted to deliver successful matrix injection performance, presents the improvements implemented during offshore execution and provides an insight into the early life injection performance.
Morrison, Graeme (Woodside Energy Ltd)
A multi-chamber, finite-difference dynamic model assumed pore-scale reservoir properties and replicated laboratory processes to determine oil-water transition zone characteristics: (1) imbibition residual oil saturation to water flood (distributed as a function of rock properties); (2) imbibition oil-water relative permeability (according to both steady-state and unsteady-state calculations); and, (3) microscopic oil recovery factor as a function of height above the free water level. A priori, scaled capillary pressure hysteresis and segregated flow conditions governed pore-scale flow behaviour (initial mobile saturation scaled 0-1), assuming a pipeline network analogue. Although an oil-water system was modelled, the process could be extended to other fluid phases.
The dynamic model was divided into five chambers replicating two experiments. Flow between chambers was regulated by transmissibility multipliers (open/shut). Replicating a core displacement experiment, an oil source rock "drained" (flooded) the central core chamber to form the oil-water transition zone; water then imbibed into the core chamber displacing mobile oil into a shallower effluent chamber; capillary pressure hysteresis defined the imbibition residual oil saturation to water flood. In a second experiment, the central core chamber was subdivided into zones, each comprising five model layers, and the imbibition oil-water relative permeability was determined by flooding the quiesced core chamber situated between high/low-pressure chambers (source/sink). Both steady-state and unsteady-state relative permeability were determined across the oil-water transition zone.
The Digital Core Laboratory (DCL) determined the distribution of imbibition residual oil saturation as a function of: wettability assumption; rock properties; and, proximity to the free water level. For a range of wettability assumptions, imbibition oil-water relative permeability was determined across the transition zone, on a zonal basis, from top core chamber (top reservoir) down to the free water level.
Summary of Results: (1) The imbibition residual oil saturation (ISorw) decreased with depth toward the free water level, while the drainage irreducible water saturation (Swir) increased with depth; (2) ISorw was found to be strongly influenced by the wettability assumption, contrasting with the uniformity of Swir; (3) Microscopic oil recovery factor was determined across the transition zone as a function of both proximity to the free water level and the wettability assumption; except for the oil-wet case, microscopic oil recovery factor increased with depth toward the free water level.
The novel approach was the multi-chambered design of the finite difference model forming a Digital Core Laboratory (DCL). Tuning the model to a Special Core Analysis (SCAL) data set, with the objective of filling SCAL data gaps, is expected to be one application.
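The abstract does not name the trapping model used to define the imbibition residual oil saturation from capillary pressure hysteresis. As illustration only, a common choice in the literature is Land's (1968) trapping correlation, which relates residual non-wetting-phase saturation to the initial saturation before imbibition; a minimal sketch, assuming that correlation:

```python
# Hedged sketch: Land's trapping correlation, a common (but here assumed,
# not source-confirmed) model relating initial oil saturation to residual
# oil saturation after water flood.

def land_residual_oil(so_initial, sorw_max, so_max=1.0):
    """Residual oil saturation after imbibition via Land's correlation.

    so_initial : initial (mobile) oil saturation before the water flood
    sorw_max   : residual oil saturation measured from a flood started at
                 the maximum initial oil saturation so_max
    """
    # Land trapping coefficient C, fixed by the maximum-saturation flood
    c = 1.0 / sorw_max - 1.0 / so_max
    # Trapped (residual) saturation for an arbitrary initial saturation
    return so_initial / (1.0 + c * so_initial)
```

Note that this form reproduces the qualitative result reported above: lower initial oil saturations near the free water level yield lower residual oil saturations.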
We showcase an innovative campaigning and business-focused approach to reservoir monitoring of multiple fields using 4D (time-lapse) seismic. Benefits obtained in terms of cost, speed and the quality of insights gained are discussed, in comparison with a piecemeal approach. Challenges and lessons learned are described, with a view to this approach becoming more widely adopted and allowing 4D monitoring to be extended to smaller or more marginal fields.
An offshore seismic acquisition campaign was planned and successfully executed for a sequence of four 4D monitor surveys for fields located within 250 km of each other on the greater Northwest Shelf of Australia. The four monitors were acquired in H1 2020 comprising (in this order): Pluto Gas Field M2 (second monitor), Brunello Gas Field M1 (first monitor), Laverda Oil Field M1 and Cimatti Oil Field M1.
Cost savings expected from campaigning were realised, despite three cyclones during operations, with success largely attributed to detailed pre-survey planning. Also important were the choice of vessel and planning for operational flexibility. The baseline surveys were diverse and required careful planning to achieve repeatability between vintages over each field, and to optimise the acquisition sequence – minimising time required to reconfigure the streamer spreads between surveys. The Cimatti baseline survey was acquired using a dual-vessel operation; modelling, combined with now-standard steerable streamers, showed a single-vessel monitor survey was feasible. These optimisations provided cost savings incremental to the principal economy of sharing vessel mobilisation costs across the whole campaign.
Both processing and evaluation (ongoing at the time of writing) are essentially separate per field, but follow a consistent approach. Processing is carried out by more than one contractor to debottleneck this phase, with products, including intermediate quality control (QC) volumes, delivered as pre-stack depth migrations. While full evaluation of the monitor surveys to static and dynamic reservoir model updates will continue beyond 2020, key initial reservoir insights are expected to emerge within days of processing completion, with some even earlier from QC volumes. Furthermore, concurrent 4D evaluations are expected to result in fruitful exchanges of ideas and technologies between fields.
Automated pattern recognition with basic, high-level coding can be readily applied to petroleum production surveillance to reduce the impact of equipment failure. Machine learning encompasses regression techniques that range in complexity from simple linear regression to convolutional neural networks. This paper outlines a machine-learning-based solution that was developed for a common petroleum engineering problem.
A temporary proxy for downhole pressure measurements was developed after gauge failure on an offshore gas production well. A solution was found in the machine learning space by applying multivariate linear regression to represent relationships within the production system. The workflow presented is based on Python code using the open source SKLearn library. Readers should carry out their own independent assessment of the approach outlined in this paper (including the model development procedure pseudo code set out in
The method uses available production data (known conditions of pressure and temperature from the wellhead and further downstream, choke settings and well total mass flow rates) to predict an unknown downhole pressure. The failure of a downhole gauge was simulated by removing the downhole data from the dataset at a certain point in time. The machine learning model was trained using 19 months of well production data. The nine months that followed were then evaluated under gauge-failure conditions (with downhole data removed) to predict downhole pressure from the other production data. The result was a downhole pressure prediction within 0.2% (40 kPa) of the actual gauge measurement up to nine months after the simulated gauge failure.
The prediction was compared to downhole pressure estimations that were calculated with a conventional physical model. The machine learning model outperformed the conventional physical correlation over the test period. The model was validated as an adequate short-term replacement for downhole pressure measurement for an offshore gas well. The solution delayed disruption to the management of reserves, enabled the continuation of production forecasting and postponed subsea intervention.
This paper also provides a foundation for assisted trend analysis, in which a gauge that is identified as drifting from the long-term trend can aid in the detection of physical changes such as water breakthrough.
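The workflow described above — multivariate linear regression on surface measurements to stand in for a failed downhole gauge — can be sketched with scikit-learn. The column names, coefficients and synthetic data below are illustrative inventions, not values from the paper:

```python
# Hedged sketch: multivariate linear regression (scikit-learn) trained on
# surface production data to predict downhole pressure. All variables and
# the synthetic dataset are illustrative, not from the paper.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 1000
# Synthetic surface measurements standing in for real production data
whp = rng.uniform(5000, 9000, n)      # wellhead pressure, kPa
wht = rng.uniform(40, 90, n)          # wellhead temperature, degC
choke = rng.uniform(20, 100, n)       # choke opening, %
rate = rng.uniform(50, 200, n)        # total mass flow rate, t/h
# Synthetic "true" downhole pressure: a linear combination plus gauge noise
dhp = 1.2 * whp + 15 * wht - 5 * choke + 8 * rate + rng.normal(0, 50, n)

X = np.column_stack([whp, wht, choke, rate])
split = int(0.67 * n)                 # roughly 19 of 28 months for training
model = LinearRegression().fit(X[:split], dhp[:split])

# Predict downhole pressure over the simulated gauge-failure period
pred = model.predict(X[split:])
rel_err = np.abs(pred - dhp[split:]) / dhp[split:]
```

In practice the test-period relative error, not the training fit, is the relevant acceptance metric, mirroring the 0.2% figure quoted above.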
ABSTRACT A state-of-the-art inversion methodology is described in this case study to quantitatively integrate partially-stacked seismic data, well logs, geologic data and geostatistical information. Amplitude-versus-Angle (AVA) deterministic, stochastic and Wave-Equation Based (WEB) inversion algorithms are used to characterize Intra-Triassic Mungaroo gas reservoirs located in the Carnarvon Basin, Western Australia. AVA deterministic, stochastic and WEB inversions, which combine the advantages of AVA analysis with those of seismic inversion, have provided quantitative information about the lateral continuity of the fluvio-deltaic reservoirs as well as the delineation of the gas reservoirs, based on the interpretation of inverted elastic properties and lithology- and fluid-sensitive modulus attributes. AVA stochastic inversion provides more realistic and higher-resolution results than those obtained from analogous deterministic techniques and allows for uncertainty analysis. The quantitative use of rock/fluid information through AVA seismic data, coupled with the co-simulation of elastic/petrophysical properties, provides accurate 3D models of engineering properties such as porosity, volume of shale, and water saturation which can be directly used for static model building. Internal multiple scattering, transmission effects and mode conversion, considered as noise in linear inversion, become essential signal in WEB non-linear inversion; accordingly, WEB compressibility showed greater sensitivity for residual/live gas discrimination than conventional inversions. Presentation Date: Tuesday, September 17, 2019 Session Start Time: 8:30 AM Presentation Start Time: 9:45 AM Location: 217A Presentation Type: Oral
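The paper's inversion algorithms are not reproducible from the abstract; for orientation only, AVA analysis is commonly built on linearized reflectivity approximations such as the two-term Shuey form of the Aki-Richards equation, sketched below. This is background illustration, not the paper's method:

```python
# Hedged sketch: two-term Shuey approximation R(theta) ~ A + B*sin^2(theta)
# for PP reflectivity at an interface — standard AVA background, shown for
# illustration; it is not the inversion engine used in the paper.
import numpy as np

def shuey_two_term(vp1, vs1, rho1, vp2, vs2, rho2, theta_deg):
    """Angle-dependent PP reflectivity across an interface (two-term Shuey)."""
    theta = np.radians(theta_deg)
    # Interface-average properties and contrasts
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    # Intercept A: normal-incidence reflectivity
    a = 0.5 * (dvp / vp + drho / rho)
    # Gradient B: controls the amplitude-versus-angle behaviour
    b = 0.5 * dvp / vp - 2 * (vs / vp) ** 2 * (drho / rho + 2 * dvs / vs)
    return a + b * np.sin(theta) ** 2
```

The intercept/gradient pair (A, B) is what partial-angle stacks constrain, which is why partially-stacked data are an input to AVA inversion.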
Subsea production facilities may be “over-designed” if the worst-case “Design” conditions are included in the flow assurance analysis, as the most onerous cases usually have a low frequency of occurrence. With new frontier developments and challenging project economics, over-design is no longer acceptable as the “default solution”. Risk-based flow assurance design employs a probabilistic analysis approach which allows over-design to be avoided by quantifying key uncertainties and identifying major flow assurance risks through Monte Carlo simulations. Possible applications of risk-based decision making in flow assurance analyses include the sizing of flowlines, thermal insulation and surge capacity (slug catcher sizing), the determination of hydrate blockage formation likelihood, the definition of hydrate management strategy, the quantification of hydrate inhibitor dosage requirements, and the optimisation of network systems.
In this paper, the application of a probabilistic approach to the optimisation of flowline insulation requirements using a Risk Management and Optimisation (RMO) tool is discussed.
For a new phase of the B-field development (called “Phase 2” in this paper), new wells will be tied back to a new B-Manifold, and the fluids transported via a new insulated rigid CRA subsea flowline to an existing crossover manifold (A-XOM), where they are commingled with production fluids from the A-field. Combined production is transported via two existing Phase 1 flowlines to an offshore gas processing platform (GPP). Gas from the platform is subsequently sent to an onshore plant. A simplified field layout is provided in Figure 1.
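The probabilistic approach described above can be sketched as a Monte Carlo calculation: sample the uncertain inputs, evaluate the flow assurance response for each sample, and read off the probability of meeting a target. The lumped-capacitance cooldown formula and every parameter distribution below are illustrative assumptions, not values from the paper:

```python
# Hedged sketch: Monte Carlo estimate of the probability that a flowline's
# cooldown time meets a target, for insulation screening. All distributions,
# values and the lumped-capacitance model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Uncertain inputs (illustrative distributions)
u_value = rng.normal(2.0, 0.3, n)            # insulation U-value, W/(m2 K)
t_fluid = rng.uniform(60.0, 80.0, n)         # shut-in fluid temperature, degC
mcp_per_area = rng.normal(9.0e4, 1.0e4, n)   # fluid heat capacity per unit
                                             # heat-transfer area, J/(m2 K)
t_amb = 5.0                                  # seabed temperature, degC
t_hyd = 20.0                                 # hydrate formation temp., degC

# Lumped-capacitance cooldown time to reach hydrate temperature, in hours
cooldown_h = mcp_per_area / u_value * np.log(
    (t_fluid - t_amb) / (t_hyd - t_amb)) / 3600.0

target_h = 12.0                              # required no-touch time, hours
p_meet = np.mean(cooldown_h >= target_h)     # probability target is met
```

An insulation option is then judged on `p_meet` against a risk tolerance, rather than on a single deterministic worst case — the essence of avoiding over-design.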
Charlton, Thomas B. (The University of Western Australia) | Kegg, Stuart (Woodside Energy Ltd) | Morgan, Julie E. P. (Woodside Energy Ltd) | Zerpa, Luis E. (Colorado School of Mines) | Koh, Carolyn A. (Colorado School of Mines) | May, Eric F. (The University of Western Australia) | Aman, Zachary M. (The University of Western Australia)
Abstract This study provides valuable insights into hydrate management strategies as the industry transitions away from complete hydrate avoidance, particularly for the development of deep-water reservoirs with stricter economic margins. Transient simulation tools, such as the deployed hydrate deposition model, extend our ability to estimate blockage likelihood from heuristics to quantitative predictions. The model is applied to an insulated subsea tieback to identify the optimal no-touch-time (NTT) and depressurization pressure (DPP) following an unplanned shutdown. Two water-production scenarios are considered, from the lowest expected to the highest manageable rates. A complete hydrate blockage is predicted when the NTT is extended several hours beyond the nominal value for the highest water-to-gas ratio (WGR). Complete blockages are predicted for both low and high WGRs when the flowline is only partially depressurized; however, longer cooldown times for the high WGR case (due to greater volumes of residual liquids) meant a blockage took more than twice as long to occur as in the low WGR case. Fully depressurized restarts are both difficult and time-consuming, leading to hydrate volume fractions (with respect to the pipe volume) exceeding 30 vol.%. An alternative hydrate management strategy is identified for cases with high volumes of water production, in which the flowline is only partially depressurized once the nominal NTT has elapsed, utilising the increased heat capacity of residual liquids. This reduces the quantity of gas sent to flare and simplifies the restart procedure.
Abstract Uncertainty is present at every stage of the subsurface modelling workflow and understanding it is an ongoing challenge for the petroleum industry. Quantifying this uncertainty is a rapidly growing field of study as increasingly available high-performance computing enables the application of traditional statistical methods to this problem. However, the extension of these methods to spatial data remains a challenge for which there is no immediate solution. This paper describes the use of data analytics techniques to incorporate spatial uncertainty in reservoir surfaces into subsurface modelling. A metric usually applied in image analytics, the Modified Hausdorff Distance, is adapted for this purpose. The workflow involves sampling the domain of possible surface realisations, characterising them using this metric and determining the most efficient subset to represent the entire data set. The value of this process is that the selected subset captures spatial uncertainty in the surface rather than only gross rock volume. The proposed technique proved to be a simple process that was able to easily select these surfaces from a stochastically generated set and has been successfully applied to the top reservoir surfaces in two fields.
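The metric at the core of the workflow above, the Modified Hausdorff Distance (Dubuisson and Jain, 1994), is straightforward to compute for two surfaces treated as 3D point sets; a minimal sketch (the surface representation as flattened (x, y, z) points is an assumption for illustration):

```python
# Hedged sketch: Modified Hausdorff Distance between two surfaces
# represented as (n, 3) arrays of (x, y, z) points. How the paper's
# workflow discretises its surfaces is not stated; this is illustrative.
import numpy as np

def modified_hausdorff(a, b):
    """MHD between point sets a (n, 3) and b (m, 3)."""
    # Pairwise Euclidean distances between every point of a and of b
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    # Mean nearest-neighbour distance, in each direction
    d_ab = d.min(axis=1).mean()
    d_ba = d.min(axis=0).mean()
    # MHD is the larger of the two directed averages
    return max(d_ab, d_ba)
```

Pairwise MHD values over a stochastic set of surface realisations give a distance matrix, from which a small representative subset can be chosen (e.g. by clustering), so that the subset spans the spatial uncertainty rather than just the gross rock volume range.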
Hardy, Madeline (Woodside Energy Ltd) | Baker, Mark (Woodside Energy Ltd) | Robson, Alex (Woodside Energy Ltd) | Williams, Jackson (Woodside Energy Ltd) | Murphy, Chris (Woodside Energy Ltd) | O'Sullivan, Liam (Woodside Energy Ltd)
Abstract Insights from appraisal well tests can take months to incorporate into subsurface modelling, causing delays to development planning and resulting in key decisions being made using incomplete data and sub-optimal methods. This is due to the time-consuming process of updating or rebuilding reservoir models, simulating them and subsequently analysing the results. In this project, a combination of automated geomodelling, rapid dynamic simulation and statistical analysis was applied to reduce the time to insights from months to days. Well test pressure data were used to condition a suite of reservoir models and evaluate the impact on the optimal development scenario. The application of this process increased confidence in the decision and reduced the modelled probability of low-side outcomes. In addition, we trialled a process to deliver an improvement to the geological understanding of the field through a reduction in the model uncertainties. We also discuss an extension of this concept to perform a robust value-of-information assessment of appraisal or development planning decisions.
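The conditioning step described above can be sketched, in its simplest form, as misfit-based rejection: simulate each static realisation, compare its pressure response to the well test data, and keep only the consistent models. The synthetic responses and acceptance threshold below are illustrative assumptions; the paper's automated geomodelling and rapid simulation tooling is not reproduced here:

```python
# Hedged sketch: conditioning an ensemble of reservoir models to well-test
# pressure via misfit-based rejection. All data and thresholds are
# illustrative assumptions, not from the paper.
import numpy as np

rng = np.random.default_rng(2)
n_models, n_obs = 500, 24
observed = 20000.0 - 50.0 * np.arange(n_obs)   # observed BHP drawdown, kPa
noise_sd = 100.0                               # assumed gauge noise, kPa

# Stand-in for the simulated pressure response of each static realisation:
# a per-model systematic bias plus small per-sample scatter
bias = rng.normal(0.0, 400.0, (n_models, 1))
simulated = observed + bias + rng.normal(0.0, 50.0, (n_models, n_obs))

# RMS misfit per model; accept realisations consistent with the data
rms = np.sqrt(((simulated - observed) ** 2).mean(axis=1))
accepted = rms < 3.0 * noise_sd
posterior = simulated[accepted]
```

The accepted subset is the conditioned ensemble: development scenarios are then re-ranked on it, which is how the well test narrows the modelled probability of low-side outcomes.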
Abstract In an era of automated workflow-assisted dynamic modelling, Special Core Analysis (SCAL) parameters require updating for each static realisation and evaluation at a quantifiable, probabilistic level-of-certainty. Additionally, SCAL data gaps combined with limited reliable SCAL data drive the need to establish trends and correlations from analogues. SCAL parameters from analogue fields were selected and filtered by depositional environment and laboratory experiment type (centrifuge versus displacement). These analogue SCAL parameters were allocated to statistical bins defined by absolute permeability ranges. Statistical analysis of each SCAL parameter allocated to each permeability bin produced a probability distribution discretised by percentile. Multi-variable linear regression (MVLR) was then implemented to correlate each SCAL parameter, as the response variable, to input variables absolute permeability and percentile. SCAL correlations of reasonable to excellent quality were obtained. The depositional environment was of second order influence in establishing these SCAL correlations. This was due to the selection of core plugs for laboratory analysis from layers of similar quality irrespective of the depositional environment, highlighting the need to select samples characterising a range of lithology and reservoir quality. Centrifuge experiments of water displacing gas were discarded as unreliable due to the compression of the gas phase by the experimental technique. The multi-variable linear regression methodology enabled SCAL parameters to be determined as a function of both absolute permeability and probability. This approach should enable an automated implementation of SCAL parameters within each dynamic model realisation.
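The MVLR step described above — each SCAL parameter as the response, with absolute permeability and percentile as input variables — can be sketched with scikit-learn. The synthetic Swir response and its coefficients below are invented for demonstration only:

```python
# Hedged sketch: multi-variable linear regression of a SCAL parameter (an
# illustrative synthetic Swir) on log-permeability and percentile, as per
# the methodology described above. All data and coefficients are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 400
log_k = rng.uniform(0.0, 4.0, n)        # log10 permeability, 1-10000 mD
percentile = rng.uniform(0.0, 100.0, n) # probability level within a perm bin

# Synthetic response: Swir decreasing with permeability, with the
# percentile input spreading the distribution within each bin
swir = (0.45 - 0.08 * log_k
        + 0.001 * (percentile - 50.0)
        + rng.normal(0.0, 0.01, n))

X = np.column_stack([log_k, percentile])
mvlr = LinearRegression().fit(X, swir)
r2 = mvlr.score(X, swir)
```

Because the fitted model takes probability (percentile) as an input, a dynamic model realisation can draw a consistent SCAL parameter set at any chosen level of certainty, which is what enables the automated per-realisation implementation described above.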