While Schlumberger, Halliburton, and Baker Hughes focus their legacy technology and service portfolios on driving up efficiency, driving down cost, and making current sources of energy less carbon intensive, the three companies diverge in their approaches to scaling up the development and deployment of breakthrough clean energy technologies. Halliburton and Schlumberger have each carved out a new, small business; Schlumberger's serves as an umbrella for a portfolio of ventures. Baker Hughes, on the other hand, weaves clean energy technology development through and across its four vertical product companies and its product and service lines, using cross-functional teams and guidance from its energy transition function. Halliburton's most significant exposure to clean energy sectors will likely come through its technology incubator/accelerator, Halliburton Labs, through which the company provides intellectual and financial support for the development of promising clean energy technologies and helps participants scale their businesses.
Schlumberger and IBM announced the industry's first commercial hybrid cloud data management solution for the OSDU data platform. The hybrid cloud offering is designed to expand access for customers globally, including those in locations where data residency requirements and local regulations may affect the use of a global public cloud. "This collaboration is a game changer for energy operators to drive higher performance and greater efficiencies by now enabling integrated workflows and innovation using AI. The hybrid cloud solution allows clients to maintain the sovereignty of their data and also gives them options as to how they choose to leverage the solution, with the freedom to deploy on a range of infrastructures or a regional cloud provider," said Manish Chawla, global industry managing director for energy, resources, and manufacturing at IBM. The platform will provide energy operators with full interoperability, making their data accessible by any application within their exploration and production (E&P) environment through the OSDU common data standard to enable easy sharing of information between teams.
Schlumberger announced on 22 June that it has joined the decarbonization effort that is sweeping the upstream oil and gas industry with its plans to achieve net-zero greenhouse gas (GHG) emissions by 2050. The world's largest oilfield service company by market capitalization spent the last 18 months analyzing the path forward, which it said accounts for the full value chain of its business (i.e., Scope 1, 2, and 3 emissions). "There is a new industry imperative to address climate change while meeting the demand for energy both today and in the long term, sustainably. We have a 2050 net-zero carbon emissions ambition which I believe is unique in our industry due to our capabilities as a technology company and our culture grounded in science. This reinforces our commitment to unlocking access to energy, for the benefit of all," Olivier Le Peuch, Schlumberger's chief executive officer, said in the announcement.
Schlumberger and Panasonic have announced that they will collaborate on a new battery-grade-lithium production process that they say will pave the way for improved lithium production to help meet the expected surge in demand from the fast-growing global electric vehicle (EV) market. The announcement came from the Schlumberger New Energy arm of Schlumberger and from Panasonic Energy of North America, a division of Panasonic Corporation of North America. The lithium-extraction and -production process will be used by Schlumberger at the Nevada pilot plant of its Neolith Energy venture. According to Schlumberger, Neolith Energy's approach uses a differentiated direct-lithium-extraction process to produce high-purity, battery-grade lithium material while reducing production time from more than a year to weeks. The company also said the process significantly reduces groundwater use and physical footprint vs. conventional evaporative methods of extracting lithium.
Digital drilling experts spend a lot of time wondering, "What was the driller thinking?" They are not being sarcastic. The question matters for those doing analysis or writing control algorithms because the significance of a reading, such as the level of torque at any moment, depends on what the driller is doing at the time. The interpretation differs depending on whether the rig is drilling, where significant torque is required, or reaming, where the resistance is likely minimal. "During the reaming process, a spike in torque indicates something altogether different from a similar spike while drilling. So, the machine must recognize at least these two states: drilling and reaming," said Fred Florence, a drilling consultant.
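As a toy illustration of why rig state matters to an analysis or control algorithm, the logic might gate its reading of a torque spike on the inferred state. The function names, thresholds, and state labels below are illustrative assumptions, not any vendor's actual logic:

```python
def classify_rig_state(weight_on_bit, rpm, block_moving):
    """Toy rig-state classifier (assumed thresholds): drilling needs
    rotation plus weight on bit; reaming is rotation while the
    traveling block moves with little weight applied."""
    if rpm > 0 and weight_on_bit > 5.0:   # klbf, assumed threshold
        return "drilling"
    if rpm > 0 and block_moving:
        return "reaming"
    return "other"

def interpret_torque_spike(state, torque, baseline):
    """The same torque spike reads differently in different states."""
    if torque <= 1.5 * baseline:          # assumed spike criterion
        return "normal"
    return ("possible downhole drilling problem" if state == "drilling"
            else "possible hole-cleaning or tight-spot issue")
```

The point of the sketch is the branch on `state`: without first recognizing drilling vs. reaming, the torque reading alone is ambiguous.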
In earlier days, the main technology developments were mostly related to materials, such as fluids and proppants, and their characterization. In recent years, more advancements have been made in tools, engineering processes, and analyses. In a cased-hole fracturing treatment, perforating plays a critical role in the success of the job, though it is often overlooked because perforations are visualized as simple holes with an empty tunnel behind the pipe, and any damage is assumed to be irrelevant because the fracture will simply bypass it. In fact, a shaped charge is made of a metal liner and a case with explosive loaded in between.
Can a camera do a better job than humans at creating a minute-by-minute record of drilling activity? Or at evaluating damaged drill bits? Based on two papers presented at the SPE/IADC International Drilling Conference and Exhibition, the answer might soon be yes. Schlumberger researchers showed that video from cameras on the drill floor and on the drilling cuttings shaker kept an accurate record of what the rig was doing at the time, also known as the rig state (SPE 204086). A paper from The University of Texas at Austin (UT-Austin) showed that a camera phone could create images for use in its bit-evaluation program to assess the damage and offer a reasonable explanation for what caused it (SPE 204124). A year ago, Teradata announced its BitBox, which houses cameras and lights to create 3D bit images.
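Whatever model produces the per-frame rig-state predictions, they still have to be condensed into the minute-by-minute activity record the papers describe. A minimal sketch of that post-processing step (the state labels and frame rate are assumptions, not the papers' actual pipeline):

```python
from itertools import groupby

def activity_log(frame_states, fps=1.0):
    """Collapse a sequence of per-frame rig-state predictions into
    (state, start_s, end_s) intervals, i.e. an activity record."""
    log, t = [], 0.0
    for state, run in groupby(frame_states):
        duration = sum(1 for _ in run) / fps
        log.append((state, t, t + duration))
        t += duration
    return log

# e.g. activity_log(["drilling", "drilling", "reaming"], fps=1.0)
#      -> [("drilling", 0.0, 2.0), ("reaming", 2.0, 3.0)]
```

Run-length grouping like this is what turns noisy frame-level classifications into the kind of log a human tour sheet would contain.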
Schlumberger announced a collaboration with Amazon Web Services (AWS) to deploy domain-centric software, enabled by the service company's Delfi exploration and production software suite, on the cloud with AWS. This collaboration will bring AWS customers to Delfi, which provides access to Schlumberger applications enhanced with artificial intelligence (AI) and high-performance computing from AWS's secure infrastructure. "Our partnership with AWS complements our strategy to further expand access to the Delfi environment so that more customers can benefit from their subsurface data," said Rajeev Sonthalia, president of digital and integration at Schlumberger. Schlumberger said in a press release that using software supported by the Delfi environment and running on AWS will allow customers to run complex models, computer simulations, and analyses more quickly than has been possible using traditional methods. "With AWS, Schlumberger can leverage the most comprehensive set of cloud services in the world, including AI and machine-learning services that easily integrate with customer applications," said Matt Garman, senior vice president of sales and marketing at AWS. "Schlumberger's cloud-based solutions paired with the high performance, scalability, and security of AWS cloud increase efficiencies so customers have more freedom to innovate."
Wang, Haifeng (Schlumberger) | Thiel, Michael (Schlumberger) | Denichou, Jean-Michel (Schlumberger) | Salim, Diogo (Schlumberger) | Leveque, Soazig (Schlumberger) | Wibowo, Vera Krissetiawati (Schlumberger) | Woods, Chris (Woodside Energy Ltd.) | Baker, Darren (Woodside Energy Ltd.)
Abstract Recently, the drilling industry has seen many advances in the application of deep directional electromagnetic (EM) measurements for mapping deeper into the reservoir, with the latest capable of seeing over 250 ft above and below the wellbore, providing unprecedented understanding of the reservoir. This measurement technology is now being used to look ahead of the bit while drilling exploration wells to reduce the drilling risks associated with unexpectedly penetrating certain formations. With the increasing complexity of the reservoirs the industry is targeting, there is a growing demand to expand reservoir mapping capability beyond a 1D approach, which can only map resistive boundaries on or near the vertical axis and assumes infinite extent in all other directions, to enable geoscientists to better steer the well and better understand the reservoir structure and fluid contacts in a full three-dimensional context around the wellbore. In this communication, the authors introduce a new solution to this quest for full three-dimensional real-time reservoir mapping. The solution is composed of three parts: a set of new measurements acquired downhole and transmitted to surface in real time, a new inversion algorithm that is model independent and therefore fit for any reservoir complexity, and a new computing paradigm that makes it possible to provide answers in real time while drilling. The new set of measurements almost doubles the number of well logs acquired before and greatly enriches formation evaluation around the wellbore. The new algorithm, unlike all previous algorithms, is not confined to any specific form of model, making it suitable for exploring and finding solutions in complex reservoir settings. 
Finally, taking advantage of the latest advances in cloud computing, the turnaround time of the new inversion is improved by more than a hundred times, thanks to the scalability of the algorithm design and the cloud computing infrastructure. Combining all of these makes it possible to achieve a three-dimensional reservoir map without having to trade off between high resolution and depth of investigation. The 3D reservoir map, generated in real time from multiple transverse 2D inversion slices, enables timely updates of the reservoir model as drilling progresses so the operator can make informed decisions. This new technology is currently deployed in several locations around the world and in different environments. In this paper, the authors review deployment results to illustrate the technology, from preparation to real-time execution, and finally to post-job model update. With the ability to map in all directions while drilling, this technology opens the door to many applications and will enable operators to target more complex reservoirs and achieve better geosteering results where 3D mapping and steering are required. In addition to its benefits for real-time operations, the technology also enables geoscientists to update and calibrate their reservoir models with fine and accurate details, which can further benefit multiple disciplines, including drilling, completion, production, and reservoir management.
Oliveira, Jansen (REPSOL) | H., Karl Perez (REPSOL) | V., Alejandro Martin (REPSOL) | T., Ricard Fernandez (REPSOL) | N., Teresa Polo (REPSOL) | V., Lorenzo Villalobos (REPSOL) | Dubost, Francois Xavier (Schlumberger) | V., Manuel Lavin (Schlumberger) | Gisolf, Adriaan G. (Schlumberger) | Jackson, Richard R. (Schlumberger) | Edmundson, Simon (Schlumberger) | Dumont, Hadrien (Schlumberger) | E., Hugo Hernandez (Schlumberger) | Espinosa, Javier (Schlumberger)
Abstract Offshore exploration requires the evaluation of hydrocarbon presence, estimation of volumes in place, and assessment of flow potential. To this end, formation testers are widely used to determine static data such as reservoir fluid gradients and reservoir pressure, to obtain fluid samples, and to assess reservoir connectivity. Dynamic data, acquired with interval pressure transient testing and well testing techniques, are used to assess reserves and productivity. However, these evaluation techniques provide dynamic data at different resolutions and length scales, and with different environmental footprints, costs, and operational constraints. A new wireline formation testing technique known as deep transient testing (DTT) has been introduced, which combines high-resolution measurements, higher flow rates, and longer test durations to perform transient tests in higher-permeability, thicker formations and at greater depths of investigation than previous formation testers - without flaring and with a low carbon footprint. The platform combines advanced metrology with extensive automation to generate unique, real-time reservoir insights. Traditionally, pressure transient analysis and well deliverability predictions were produced through an analytical framework. Today, deep transient testing measurements are interpreted, and placed in reservoir context, in real time through integration with geological and reservoir models. These steps can be performed from any wellsite utilizing cloud-based resources. Products such as reservoir fluid compressibility, saturation pressure, equation-of-state (EOS) models, well productivity, and minimum connected volumes are integrated into the real-time interpretation utilizing numerical analysis. The digital infrastructure enables key reservoir insights to be shared among all stakeholders in a transparent and collaborative environment for both operational control and rapid decision making. 
This paper presents a case study where the new DTT technique was combined with numerical analysis and real-time integrated workflows to characterize a multilayer reservoir in a recent discovery in deepwater Mexico. During the drawdown phase of the DTT operation, real-time downhole fluid analysis was used to determine the fluid composition, density, viscosity, compressibility, and saturation pressure. These fluid properties were then used to generate and tune an EOS model. Accurate drawdown flow rate measurements and the subsequent pressure transients were combined with the fluid model and geologic model to enable integrated pressure transient history matching. The resulting calibrated numerical model honors the fluid measurements and geologic model and was used to predict the permeability profile, zonal producibility, and the volume of influence of the test.