Digital Transformation across the geosciences and engineering domains is driven by the need to share data of increasing complexity and quantity as part of a distributed workload involving operators, partners, service companies and consultants. Unconventional plays pose an additional challenge, since large numbers of wells must be planned and then drilled within days. Companies operating multiple rigs, fracking crews and workover teams have a significant, time-constrained workload that requires rigorous and efficient execution.
To support broad collaboration among all actors involved, such a work context would ideally have all users operating the same software suite, with the ability to periodically synchronize data repositories or to work concurrently in a common cloud environment. However, the reality is that different companies, or departments within a company, use different software systems. The push to apply more analytics, new modeling technologies or broader, multi-disciplinary workflows to unconventional reservoirs, with software tools often sourced from new third-party providers, adds to the burden of more complex data flows and the concomitant data verification required from data consumers.
File formats exist to transfer specific data objects, such as well data (e.g. LAS, DLIS) or grid data, between different software platforms. These formats package the information for one data instance, such as one well. With large unconventional resource play projects drilling hundreds of wells a year while managing production from hundreds more, individually writing and reading files for all new or updated wells and sharing this data across multiple systems is labor-intensive and prone to errors or omissions. This in turn puts the burden on the recipient of the data to verify data correctness and completeness.
An industry consortium has developed and published a data exchange standard that packages all the data pertaining to a hydrocarbon reservoir system. This includes subsurface measurements, interpretation, drilling, completion and production data such that a complete project can be exported from one system and read into another in a single pass. The process supports a full charting of data items and their metadata. This drastically reduces the workload at both ends of the transaction while offering significantly better guarantees that the set of data is complete and correctly referenced.
This has been proven in a full-scale live demonstration involving six different software systems, each executing a step in a reservoir model enrichment workflow and writing out the data ready for the next system to ingest. The workflow steps and read-write operations together took 45 minutes, significantly less time than would be needed to execute such a workflow using traditional I/O processes and file formats, and at considerably lower risk of errors or omissions.
Summary
Subsalt imaging at the Kepler field, located in the deepwater Gulf of Mexico, has proven challenging, largely because of suboptimal illumination and complicated ray bending caused by the complex overburden salt. It is well understood that building an accurate velocity model is one of the prerequisites for obtaining a good subsalt image. Multi-azimuth seismic data sets not only improve subsalt illumination, they also benefit the velocity model building process by providing more data to constrain the updates. In this paper, we demonstrate the impact of multi-azimuth data and the Kepler salt geometry on subsalt illumination through a 3D ray-tracing and finite-difference modeling illumination study. The integration of a top of salt (TOS) anomaly layer into the Kepler salt velocity model significantly improved the subsalt images and reduced base of salt depth uncertainties.
Elebiju, Bunmi (BP America) | Ariston, Pierre-Olivier (BP America) | van Gestel, Jean-Paul (BP America) | Murphy, Rachel (BP America) | Chakraborty, Samarjit (BP America) | Jansen, Kjetil (BP America) | Rodenberger, Douglas (Shell America) | White, Roy C. (Shell America) | Chen, Yongping (CGG) | Hren, David (CGG) | Hu, Lingli (CGG) | Huang, Yan (CGG)
Using the Kepler and Ariel Fields as a case study, this paper discusses the processing challenges and solutions applied to a 4D co-processing of Wide Azimuth Towed Streamer (WATS) on Narrow Azimuth Towed Streamer (NATS) data. Unlike a dedicated 4D acquisition, WATS on NATS 4D has relatively low repeatability in acquisition geometry, along with bandwidth differences. All these factors can negatively impact the extraction of a meaningful 4D signal. In this paper, we demonstrate how processing techniques can help to increase repeatability and enhance the 4D signal. We focus on the following 4D processing procedures: 4D co-binning, data matching, and post-migration co-denoise. Due largely to these techniques, the final co-processed volumes show an optimized 4D seismic signal with a median Normalized Root Mean Square (NRMS, a measure of repeatability between base and monitor surveys; see Kragh and Christie, 2002) of 0.10 along the water bottom and 0.28 above the reservoir.
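For readers unfamiliar with the metric, the NRMS definition of Kragh and Christie (2002) can be sketched in a few lines of Python. This is a minimal, illustrative implementation of the published formula on synthetic traces; the authors' actual computation (trace windowing along the water bottom and above the reservoir, survey alignment, etc.) is not described in this abstract, and the example data below are hypothetical.

```python
import numpy as np

def nrms(base, monitor):
    """Normalized RMS difference between two traces (Kragh and Christie, 2002):
    NRMS = 2 * RMS(monitor - base) / (RMS(monitor) + RMS(base)).
    0 means identical traces; values approaching 2 indicate no repeatability."""
    rms = lambda x: np.sqrt(np.mean(np.square(x)))
    return 2.0 * rms(monitor - base) / (rms(monitor) + rms(base))

# Hypothetical synthetic example: a 30 Hz sinusoid as the "base" trace,
# and the same trace plus weak random noise as the "monitor" trace.
t = np.linspace(0.0, 1.0, 500)
base = np.sin(2.0 * np.pi * 30.0 * t)
monitor = base + 0.05 * np.random.default_rng(0).standard_normal(t.size)

print(nrms(base, base))     # identical traces: NRMS is exactly 0
print(nrms(base, monitor))  # small perturbation: small NRMS
```

Because NRMS normalizes the RMS of the difference by the average RMS of the two traces, it is sensitive to residual time shifts and amplitude mismatches as well as noise, which is why the co-binning, matching, and co-denoise steps above directly lower the reported values.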