Petroleum data analytics is the engineering application of data science to petroleum-engineering problems. The engineering application of data science is defined as the use of artificial intelligence and machine learning to model physical phenomena purely on the basis of facts (e.g., field measurements and data). The main objective of this technology is the complete avoidance of assumptions, simplifications, preconceived notions, and biases. One of the major characteristics of petroleum data analytics is its incorporation of explainable artificial intelligence (XAI). While using actual field measurements as the main building blocks for modeling physical phenomena, petroleum data analytics incorporates several types of machine-learning algorithms, including artificial neural networks, fuzzy set theory, and evolutionary computing.
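As a purely illustrative sketch (not from the article), the snippet below fits a small neural network to synthetic, hypothetical "field measurements," showing the data-driven modeling idea described above; the input variables, values, and the scikit-learn model are assumptions made for illustration.

```python
# Purely illustrative sketch (not from the article): fit a small neural network
# to hypothetical, synthetic "field measurements" so the model is driven by the
# data alone rather than by an assumed physical functional form.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical inputs: porosity (fraction), net pay (ft), initial pressure (psi);
# hypothetical target: a cumulative-production-like quantity. All values synthetic.
X = rng.uniform([0.05, 10.0, 2000.0], [0.25, 60.0, 6000.0], size=(500, 3))
y = 0.8 * X[:, 0] * X[:, 1] * np.sqrt(X[:, 2]) + rng.normal(0.0, 5.0, 500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Scale the inputs, then let the network learn the input-output relationship.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0),
)
model.fit(X_train, y_train)
print(f"Held-out R^2: {model.score(X_test, y_test):.2f}")
```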
A new Chevron-led work flow is allowing the oil company to marry organic field data with physics-based simulation models and machine-learning techniques to arrive at a more accurate prediction of well performance and, ultimately, a reliable production forecast for unconventional oil fields. Standard production-forecast techniques for unconventional asset development rely mostly on field data, which can suffer from limitations in both quality and quantity. Interpreting subsurface dynamics directly from field observations is also a challenge. Popular methods such as decline curve analysis can be hampered by limited data samples and too many variables. Reservoir simulation depends mostly on finding a good history match for the current field, but this method is resource-intensive and requires specialized expertise.
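For context, decline curve analysis, the popular method mentioned above, typically fits an empirical rate model to observed production and extrapolates it. The sketch below is illustrative only and is not Chevron's work flow: it fits an Arps hyperbolic decline to hypothetical monthly rates, and the data, initial guesses, and bounds are assumptions.

```python
# Illustrative only (not Chevron's work flow): classical decline curve analysis
# using the Arps hyperbolic model, q(t) = qi / (1 + b*di*t)**(1/b), fitted to
# hypothetical monthly rates and extrapolated as a forecast.
import numpy as np
from scipy.optimize import curve_fit

def arps_hyperbolic(t, qi, di, b):
    """Arps hyperbolic decline rate at time t (months)."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

# Hypothetical 18 months of observed rates (bbl/d) with multiplicative noise.
rng = np.random.default_rng(1)
t_obs = np.arange(18, dtype=float)
q_obs = arps_hyperbolic(t_obs, qi=1200.0, di=0.15, b=0.9) * rng.normal(1.0, 0.05, t_obs.size)

# Fit qi, di, and b to the limited sample, then extrapolate to month 59.
popt, _ = curve_fit(arps_hyperbolic, t_obs, q_obs, p0=[1000.0, 0.1, 0.5],
                    bounds=([1.0, 1e-3, 0.01], [1e5, 5.0, 2.0]))
t_future = np.arange(18, 60, dtype=float)
print("Fitted qi, di, b:", np.round(popt, 3))
print("Forecast rate at month 59 (bbl/d):",
      round(float(arps_hyperbolic(t_future, *popt)[-1]), 1))
```

With only a short, noisy history, the fitted parameters and the long-range forecast can shift substantially, which is the data-quantity limitation the article points to.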
Scientists from quantum computing company D-Wave have demonstrated that, using a method called quantum annealing, they could simulate some materials up to 3 million times faster than would be possible with corresponding classical methods. Together with researchers from Google, the scientists set out to measure the speed of simulation in one of D-Wave's quantum annealing processors and found that performance increased with both simulation size and problem difficulty, reaching a million-fold speedup over what could be achieved with a classical computer. The calculation that D-Wave's and Google's teams tackled is a real-world problem; in fact, it concerns physics first worked out in the 1970s by Vadim Berezinskii, J. Michael Kosterlitz, and David Thouless, who studied the behavior of so-called exotic magnetism, which occurs in quantum magnetic systems. Kosterlitz and Thouless shared the 2016 Nobel Prize in Physics for this work, in which advanced mathematical methods were used to describe the properties of a two-dimensional quantum magnet, shedding light on the strange--or exotic--states that matter can take on. Instead of proving quantum supremacy, which happens when a quantum computer runs a calculation that is impossible to resolve with classical means, D-Wave's latest research demonstrates that the company's quantum annealing processors can lead to a computational performance advantage.
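To give a flavor of the classical baseline such hardware is compared against, the sketch below runs a small Metropolis Monte Carlo simulation of a two-dimensional XY spin model, the textbook setting for Berezinskii-Kosterlitz-Thouless physics. It is an illustrative classical simulation only, not D-Wave's quantum annealing method or the benchmark used in the study; the lattice size, temperature, and sweep count are arbitrary assumptions.

```python
# Hedged illustration: classical Metropolis Monte Carlo on a small 2D XY model
# (planar spins on a lattice). Not D-Wave's method or the study's benchmark.
import numpy as np

rng = np.random.default_rng(42)
L, T, sweeps = 16, 0.8, 200                  # lattice size, temperature, MC sweeps
theta = rng.uniform(0, 2 * np.pi, (L, L))    # spin angles on the lattice

def local_energy(th, i, j):
    """Energy of spin (i, j) coupled to its four nearest neighbors (periodic)."""
    s = th[i, j]
    nbrs = [th[(i + 1) % L, j], th[(i - 1) % L, j],
            th[i, (j + 1) % L], th[i, (j - 1) % L]]
    return -sum(np.cos(s - n) for n in nbrs)

for _ in range(sweeps):
    for _ in range(L * L):
        i, j = rng.integers(L), rng.integers(L)
        old = theta[i, j]
        e_old = local_energy(theta, i, j)
        theta[i, j] = rng.uniform(0, 2 * np.pi)      # propose a new angle
        e_new = local_energy(theta, i, j)
        # Metropolis rule: accept with probability min(1, exp(-dE/T)).
        if rng.random() >= np.exp(-(e_new - e_old) / T):
            theta[i, j] = old                        # reject; restore old spin

mag = np.hypot(np.cos(theta).mean(), np.sin(theta).mean())
print(f"Mean magnetization per spin at T={T}: {mag:.3f}")
```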
Monte Carlo is, in fact, the name of the world-famous casino located in the eponymous district of the principality of Monaco on the French Riviera. It turns out that the casino inspired famous scientists to devise an intriguing mathematical technique for solving complex problems in statistics, numerical computing, and system simulation. One of the first and most famous uses of this technique was during the Manhattan Project, when the chain-reaction dynamics in highly enriched uranium presented an unimaginably complex theoretical calculation to the scientists. Even geniuses such as John von Neumann, Stanislaw Ulam, and Nicholas Metropolis could not tackle it in the traditional way. They therefore turned to the wonderful world of random numbers and let these probabilistic quantities tame the originally intractable calculations.
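The core idea can be shown with a toy example (not the Manhattan Project calculation): draw random samples and let their statistics approximate a quantity that is awkward to compute directly. The sketch below estimates pi this way; the sample count and seed are arbitrary.

```python
# Minimal Monte Carlo sketch: random sampling stands in for an exact calculation.
import numpy as np

rng = np.random.default_rng(seed=7)
n = 1_000_000

# Scatter random points in the unit square; the fraction landing inside the
# quarter circle of radius 1 approximates pi/4.
x, y = rng.random(n), rng.random(n)
inside = (x**2 + y**2 <= 1.0).sum()
print("Monte Carlo estimate of pi:", 4 * inside / n)
```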
Artificial intelligence (AI) is evolving--literally. Researchers have created software that borrows concepts from Darwinian evolution, including "survival of the fittest," to build AI programs that improve generation after generation without human input. The program replicated decades of AI research in a matter of days, and its designers think that, one day, it could discover new approaches to AI. "While most people were taking baby steps, they took a giant leap into the unknown," said Risto Miikkulainen, a computer scientist at The University of Texas at Austin, who was not involved with the work. "This is one of those papers that could launch a lot of future research." Building an AI algorithm takes time.
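As a purely illustrative sketch of the evolutionary idea (not the researchers' system), the snippet below evolves a population of candidate bitstrings by mutation and survival-of-the-fittest selection, improving generation after generation; the target, population size, and mutation rate are made-up parameters.

```python
# Illustrative evolutionary loop (not the paper's system): mutate candidates,
# keep the fittest, and repeat so quality improves generation after generation.
import random

random.seed(0)
TARGET = [1] * 32                      # hypothetical goal: a bitstring of all ones
POP, GENS, MUT = 50, 60, 0.03          # population size, generations, mutation rate

def fitness(ind):
    """Count positions where the candidate matches the target."""
    return sum(a == b for a, b in zip(ind, TARGET))

def mutate(ind):
    """Flip each bit independently with probability MUT."""
    return [1 - g if random.random() < MUT else g for g in ind]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP)]
for gen in range(GENS):
    # Survival of the fittest: keep the best half, refill with mutated copies.
    population.sort(key=fitness, reverse=True)
    survivors = population[: POP // 2]
    population = survivors + [mutate(random.choice(survivors)) for _ in survivors]

best = max(population, key=fitness)
print(f"Best fitness after {GENS} generations: {fitness(best)}/{len(TARGET)}")
```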
Joelle Pineau doesn't want science's reproducibility crisis to come to artificial intelligence (AI). Spurred by her frustration with difficulties recreating results from other research teams, Pineau, a machine-learning scientist at McGill University and Facebook in Montreal, Canada, is now spearheading a movement to get AI researchers to open up their methods and code to scrutiny. Alongside Koustuv Sinha, a PhD student at McGill, Pineau holds one of two new roles dedicated to reproducibility on the organizing committee for the Conference on Neural Information Processing Systems (NeurIPS), a major meeting for AI that this year attracted some 13,000 researchers. Ahead of this year's conference in Vancouver, Canada, from 8 to 14 December, the committee asked scientists to provide their code and fill in a checklist of methodological details for each paper submitted. They also ran a competition that challenged researchers to recreate each other's work.
Since its establishment, the SPE Baghdad Section has been keen to focus on empowering young professionals (YPs) by organizing local and international technical events that allow them to be aware of new technologies in the upstream oil and gas industry. As part of the many opportunities the section provides its students and YPs, eight YPs were given free full passes to participate in the "XVI International Forum-Contest of Students and Young Researchers: Topical Issues for the Rational Use of Natural Resources." Held 17-19 June, the event was organized by St. Petersburg Mining University and the International Competence Centre for Mining Engineering Education under the auspices of UNESCO in Russia. As with most events these days, the meetings and presentations were held online, with outstanding participation by students, postgraduates, and young scientists from Russian and foreign universities, as well as employees of mining, geological, oil and gas, energy, engineering, and metallurgical companies, among others. Migrating the event to a web platform because of COVID-19 proved to improve the success of the forum in both the number of contestants (a record of more than 800 people) and the number of attendees (up to 2,000 people).
Shell has launched Studio X, which it describes as a first-of-its-kind open innovation studio to reimagine the future of work in exploration. The announcement came from Austin, Texas, where the studio will have its first physical location. According to Shell, Studio X will begin with three initial offerings focused on exploration and discovery. "Studio X brings tech advances at a challenging time for exploration and the global economy," said Mauhan Zonoozy, acting general manager of Studio X. "The technology within will lead to better ways of working in the exploration field--increasing collaboration and ultimately improving outcomes. Studio X will fuel big ideas, connect a network of global talent with remote work opportunities, empower and guide entrepreneurs and startups, and crowdsource bold solutions to make the ideas of today into the industry-leading tools of tomorrow."
In a quiet industrial park in suburban Toronto, there is a machine that eats carbon dioxide (CO2) and spits out fuel. This place, typically associated with strip malls, ethnic and cultural diversity, and peaceful middle-class life, might also soon be known as a hotbed of energy innovation. The project's code name is "Pond." A world away, at a world-class research institute in Bangalore, India, engineers have developed a completely different technology to convert CO2 into industrial chemicals. They are motivated by the desire to begin tackling rising global CO2 emissions.
ExxonMobil and Mosaic Materials said on 26 August that they have entered into an agreement to explore the advancement of breakthrough technology that can remove carbon dioxide from emissions sources. Mosaic Materials has progressed research on a unique process that uses porous solids, known as metal-organic frameworks, to separate carbon dioxide from air or flue gas. The agreement with ExxonMobil will enable further discussion between the two companies to evaluate opportunities for industrial uses of the technology at scale. "New technologies in carbon capture will be critical enablers for us to meet growing energy demands while reducing emissions," said Vijay Swarup, vice president of research and development for ExxonMobil Research and Engineering Company. "Our agreement with Mosaic expands our carbon capture technology research portfolio, which is evaluating multiple pathways--including evaluation of carbonate fuel cells and direct air capture--to reduce costs and enable large-scale deployment. Adding Mosaic's approach will allow us to build on their work to evaluate the potential for this technology to have a meaningful impact in reducing carbon dioxide emissions."