For certain classes of problems in high-performance computing, all supercomputers have an unavoidable and fatal bottleneck: memory bandwidth. That is the argument made this week by one startup at SC20, the supercomputing conference that normally takes place in a different city each year but is being held as a virtual event this year because of the COVID-19 pandemic. The company making that argument is Cerebras Systems, the artificial-intelligence computer maker, which contends its machine can solve problems at a speed no existing system can match. "We can solve this problem in an amount of time that no number of GPUs or CPUs can achieve," Cerebras's chief executive officer, Andrew Feldman, told ZDNet in an interview by Zoom. "This means the CS-1 for this work is the fastest machine ever built, and it's faster than any combination of clustering of other processors," he added.
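The bandwidth argument can be made concrete with a classic STREAM-style "triad" kernel. The sketch below (a minimal numpy illustration, not Cerebras code; the array size is an arbitrary assumption) performs so little arithmetic per byte moved that its runtime is set almost entirely by how fast memory can feed the processor, which is why adding more compute cores alone does not speed such problems up.

```python
# STREAM-style "triad" kernel: almost no arithmetic per byte moved, so runtime
# is set by memory bandwidth rather than by available compute. Illustrative
# numpy sketch only; the array size is an arbitrary assumption.
import time
import numpy as np

N = 20_000_000                       # ~160 MB per float64 array, far beyond cache
b = np.random.rand(N)
c = np.random.rand(N)
a = np.empty(N)

t0 = time.perf_counter()
np.multiply(c, 3.0, out=a)           # a = 3.0 * c
a += b                               # a = b + 3.0 * c  (the triad)
dt = time.perf_counter() - t0

# At minimum, three arrays must traverse memory; numpy's extra pass means the
# true hardware rate is somewhat higher than this effective figure.
print(f"effective bandwidth: {3 * N * 8 / dt / 1e9:.2f} GB/s")
```

Running the same kernel with more threads or cores barely changes the result once the memory bus is saturated, which is the bottleneck Cerebras claims its architecture sidesteps.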
The scope of emerging technology for the oil and gas industry could be summed up in just three areas: advanced computer programming, new physical toolsets, and alternative applications of upstream systems. An executive panel session held during the recent International Petroleum Technology Conference (IPTC) 2020 in Saudi Arabia featured examples of each category from several of the industry's most technologically capable firms, along with candid thoughts on how they might affect the industry's people problem. Nasir Al-Naimi, Vice President of Petroleum Engineering and Development for Saudi Aramco, highlighted a number of ambitious technology initiatives that are poised to "revolutionize" his company's core exploration operations. Among them is GeoDrive, a recently launched integrated geophysical imaging and modeling platform designed for intensive exascale computing, considered one of the next major milestones in high-performance computing. "This platform improves image processing efficiency by 90%, pushing the limits of image resolution," said Al-Naimi.
As the application of Digital Rock Analysis (DRA) expands in industrial and academic settings, the need for digital investigation of more complex porous media, such as carbonates and tight rocks, has grown. These materials exhibit a wide range of mesoscale properties arising from their complex, multiscale pore structures and their connectivity. Capturing such pore-space topology and wide pore-size distributions requires computations at high resolution over a large field of view. DRA of these samples therefore rests on three essential components: parallel computing, multiscale modeling, and process-based reconstruction of 3D volumes. These are the three main pillars of Thermo Scientific e-Core Software, making it a unique solution for characterizing more heterogeneous porous materials.
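To see why resolution and field of view pull against each other, consider the toy coarse-graining pass below (a minimal numpy sketch under assumed array sizes, not the e-Core workflow). Block averaging preserves bulk porosity at every scale but discards the sub-block pore geometry that controls connectivity, so fine-scale structure can only be kept by holding the full-resolution grid in memory, which is what pushes realistic volumes into parallel-computing territory.

```python
# Illustrative sketch only: a toy 3D binary "pore volume" coarse-grained by
# block averaging. This is not the e-Core algorithm; sizes are assumptions.
import numpy as np

rng = np.random.default_rng(0)
vol = rng.random((240, 240, 240)) < 0.25      # True = pore voxel, ~25% porosity

def coarsen(v, f):
    """Average f x f x f voxel blocks into one cell (a crude upscaling step)."""
    n = (np.array(v.shape) // f) * f
    v = v[:n[0], :n[1], :n[2]].astype(float)
    return v.reshape(n[0]//f, f, n[1]//f, f, n[2]//f, f).mean(axis=(1, 3, 5))

for factor in (1, 4, 16):
    phi = coarsen(vol, factor) if factor > 1 else vol.astype(float)
    print(f"{factor:>2}x coarsening -> grid {phi.shape}, porosity {phi.mean():.3f}")
```

The printed porosity is essentially identical at all three scales, yet the 16x grid has 4,096 times fewer cells; everything that distinguishes a carbonate's connected pore network from a random medium lives in the detail that averaging throws away.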
Some of the world's largest exploration and production companies say the big bets they have placed on high-performance computing over the past several years are set to pay off. Their confidence is bolstered by the increasing capacity of the supercomputers they have built to interpret massive volumes of seismic and geophysical data that would have been nearly impossible to analyze properly with computers considered state of the art only a decade ago. As a result of the major advances made each year in computing technology, the industry's reliance on computer science continues to grow as it earns a place alongside the hallmark upstream disciplines of geoscience and engineering. Several of the companies making significant investments in high-performance computing spoke at IPTC about the promise this technology holds for their exploration strategies. Total established itself as one of the leaders in this realm last March with the startup of a high-performance computer that it is using for seismic data imaging and to maximize future production by simulating how water, oil, and gas flow inside reservoirs.
Eni revealed its new supercomputer, HPC5, during a ceremony on 6 February. "Today, Eni unveils a supercomputing system with key features which are unique in the industrial world," said Claudio Descalzi, Eni's chief executive officer. "This system is able to boost and even further refine the highly complex processes that support Eni's people in their activities and, therefore, accelerate our digital transformation. This is an important time in the path toward the energy transition. It's another step forward to the global goal that we share with our research and technology partners: making tomorrow's energy an even closer reality."
Oil and gas is in the midst of a pervasive digital transformation in which the industry is changing the way it manages assets, the way it interacts with customers, and the way it develops internal workflows. Perhaps one of the most significant impacts of this transformation, however, is the way in which companies characterize their subsurface data. Data volumes are growing at an exponential rate, and to handle these volumes, operators are looking at high-performance computing (HPC) solutions. "The oil and gas supply today is still delivered through very large engineering projects," said Detlef Hohl, chief scientist of computation and data science at Shell. "You have things like the deepwater Gulf of Mexico, things like our Prelude FLNG. These projects are designed to be operated and maintained with the help of high-performance computing."
Big data is one of the big buzzwords in oil and gas operations today. Operators cannot get enough of it. Managing a successful venture requires the ability to extract valuable information from massive data sets and to process that information quickly and efficiently. High-performance computing (HPC) is critical to making this happen. It can help operators optimize their field-development plans and integrate those field models with surface-facilities models, minimizing risk and uncertainty and creating an environment for more confident decision making on a project.
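One common way HPC narrows uncertainty in a development plan is by brute-forcing the parameter space: run the field model under thousands of sampled scenarios in parallel and read decisions off the resulting distribution. The sketch below is a deliberately tiny stand-in (a hyperbolic Arps decline curve with assumed parameter ranges, not any operator's actual model) showing the pattern with Python's standard multiprocessing pool.

```python
# Toy illustration of HPC-style uncertainty screening: evaluate a simple
# Arps decline-curve forecast over many sampled parameter sets in parallel.
# The model, parameter ranges, and sample count are assumptions.
import numpy as np
from multiprocessing import Pool

def cumulative_production(params):
    """10-year cumulative output of a hyperbolic decline q(t) = qi/(1+b*Di*t)^(1/b)."""
    qi, di, b = params
    t = np.linspace(0.0, 10.0, 3651)               # ~daily steps over 10 years
    q = qi / (1.0 + b * di * t) ** (1.0 / b)       # rate in bbl/d, t in years
    return np.trapz(q, t) * 365.0                  # integrate and convert to bbl

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    samples = list(zip(rng.uniform(800, 1200, 10_000),   # qi: initial rate, bbl/d
                       rng.uniform(0.3, 0.9, 10_000),    # Di: initial decline, 1/yr
                       rng.uniform(0.4, 1.2, 10_000)))   # b:  hyperbolic exponent
    with Pool() as pool:                                  # fan out across cores
        eur = np.array(pool.map(cumulative_production, samples))
    print(f"P10/P50/P90 cumulative (bbl): {np.percentile(eur, [90, 50, 10]).round(0)}")
```

Swap the toy decline curve for a full reservoir-plus-facilities simulation and each evaluation takes hours instead of microseconds; the same fan-out pattern is what makes the ensemble tractable on an HPC cluster.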
Large volumes of gas can be produced at high rates for long periods from some methane-hydrate accumulations by means of depressurization-induced dissociation using conventional horizontal- or vertical-well configurations. However, most assessments of hydrate production use simplified or reduced-scale 3D or 2D production simulations. This study used a message-passing-interface (MPI) parallel code to make the first field-scale assessment of a large deep-ocean hydrate reservoir. Systems of up to 2.5 million gridblocks, running on thousands of supercomputing nodes, were required to simulate such a large system at the highest level of detail. The simulations begin to reveal the challenges inherent in producing from deep, relatively cold systems with extensive water-bearing channels and connectivity to large aquifers.
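The message-passing pattern behind such field-scale runs can be sketched generically (the study's actual simulator is not shown here; mpi4py, the slab size, and the diffusion stand-in for the flow physics are all assumptions): each rank owns a slab of gridblocks and trades one-cell "ghost" layers with its neighbors every timestep, so the global grid never has to fit on any single node.

```python
# Generic halo-exchange sketch of how MPI codes split a reservoir grid across
# ranks. Illustrative only; not the study's simulator.
# Run with e.g.: mpiexec -n 4 python halo_demo.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 1000                       # gridblocks owned by this rank
u = np.zeros(n_local + 2)            # +2 ghost cells at the slab ends
if rank == 0:
    u[n_local // 2] = 1.0            # a pressure/saturation perturbation

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for step in range(100):
    # exchange ghost cells with neighbors (PROC_NULL makes domain edges no-ops)
    comm.Sendrecv(sendbuf=u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # explicit diffusion update as a stand-in for flow/dissociation physics
    u[1:-1] += 0.25 * (u[:-2] - 2.0 * u[1:-1] + u[2:])

total = comm.reduce(u[1:-1].sum(), op=MPI.SUM, root=0)
if rank == 0:
    print(f"global sum after 100 steps: {total:.6f}")   # ~1.0: pulse still interior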