Bulk Data Sharing Process Improves Collaboration and Saves Time While Increasing Trust in Data Accuracy

Neri, Philip (Energistics Consortium Inc.)

OnePetro 

Abstract

Digital Transformation across the geosciences and engineering domains is driven by the need to share data of increasing complexity and quantity as part of a distributed workload involving operators, partners, service companies and consultants. Unconventional plays pose an additional challenge: a large number of wells must be planned and then drilled within days. Companies operating multiple rigs, fracking crews and workover teams have a significant, time-constrained workload that requires rigorous and efficient execution.

Broad collaboration among all the actors involved would ideally require every user to operate the same software suite, with the ability to periodically synchronize data repositories or to work concurrently in a common cloud environment. In reality, however, different companies, and different departments within a company, use different software systems. The push to apply more analytics, new modeling technologies or broader, multi-disciplinary workflows to unconventional reservoirs, with software tools often sourced from new third-party providers, adds to the burden of more complex data flows and the concomitant data verification required of data consumers.

File formats exist to transfer specific data objects between different software platforms: well data (e.g., LAS, DLIS), grid data and so on. These formats package the information for one data instance, such as a single well. With large unconventional resource play projects drilling hundreds of wells a year while managing production from hundreds more, individually writing and reading files for every new or updated well and sharing this data across multiple systems is labor-intensive and prone to errors or omissions. This in turn puts the burden on the recipient of the data to verify its correctness and completeness.
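The per-object exchange pattern described above can be sketched as follows. This is a hypothetical illustration, not the actual LAS or DLIS specification: the section layout and mnemonics are simplified, and the point is only that every well requires its own write-and-parse round trip.

```python
# Hypothetical sketch of per-object file exchange: each well is written
# to, and later parsed from, its own minimal LAS-style text block.
# The format here is illustrative only, not the full LAS specification.

def write_las(well):
    # Emit a ~Well header section with one line per mnemonic.
    lines = ["~Well"]
    for mnem, (value, unit) in well.items():
        lines.append(f"{mnem}.{unit} {value} :")
    return "\n".join(lines)

def read_las(text):
    # Parse the header section back into {mnemonic: (value, unit)}.
    well = {}
    for line in text.splitlines():
        if line.startswith("~"):
            continue
        head, rest = line.split(" ", 1)
        mnem, unit = head.split(".", 1)
        well[mnem] = (rest.split(":", 1)[0].strip(), unit)
    return well

# 300 wells means 300 independent write/read round trips, each one an
# opportunity for an omission that the recipient must catch.
wells = {f"W{i:03d}": {"WELL": (f"W{i:03d}", ""), "STRT": ("0", "M")}
         for i in range(300)}
round_tripped = {name: read_las(write_las(w)) for name, w in wells.items()}
```

Each round trip here is trivial, but in practice every file must also be named, transferred and re-associated with its well by hand, which is where the errors and omissions creep in.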

An industry consortium has developed and published a data exchange standard that packages all the data pertaining to a hydrocarbon reservoir system, including subsurface measurements, interpretations, drilling, completion and production data, so that a complete project can be exported from one system and read into another in a single pass. The process supports a full inventory of data items and their metadata. This drastically reduces the workload at both ends of the transaction while offering significantly better guarantees that the data set is complete and correctly referenced.
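The single-pass packaging idea can be sketched as below. The object types, field names and manifest layout are assumptions for illustration, not the published standard's actual schema; the sketch shows only the principle that every object and its metadata travel in one artifact, with inter-object references verified before export.

```python
# Hypothetical sketch of single-pass project packaging: all data
# objects (wells, logs, horizons, ...) go into one manifest, and
# references between objects are checked for completeness before the
# package is written. Schema and field names are illustrative only.
import json

def build_package(objects):
    """objects maps uid -> {"type": ..., "refs": [uids of cited objects]}."""
    missing = sorted(
        ref
        for obj in objects.values()
        for ref in obj.get("refs", [])
        if ref not in objects
    )
    if missing:
        # Refuse to export an incomplete data set rather than push the
        # verification burden onto the recipient.
        raise ValueError(f"dangling references: {missing}")
    return json.dumps({"object_count": len(objects), "objects": objects})

project = {
    "well-1":    {"type": "Well", "refs": []},
    "log-1":     {"type": "WellLog", "refs": ["well-1"]},
    "horizon-1": {"type": "Horizon", "refs": []},
}
package = build_package(project)  # one export for the whole project
```

Because the exporter checks every reference before writing, the recipient receives one self-describing artifact instead of hundreds of loose files to reconcile.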

This has been proven in a full-scale live demonstration in which six different software systems each executed a step in a reservoir model enrichment workflow and wrote out the data ready for the next system to ingest. The complete run, including the workflow steps and the read-write operations, took 45 minutes, significantly less than would be needed to execute such a workflow using traditional I/O processes and file formats, and with considerably less risk of errors or omissions.