The story of artificial lift has long been one of maintaining the status quo, the “if it ain’t broke, don’t fix it” approach, but adaptation and experimentation have been present along the way. From this pairing came innovations like the widely recognized symbol of the oil patch, the pumpjack, and the hidden marvels of technological wizardry that dwell downhole. In this second part of a two-part series on advances in artificial lift, we’ll look at the state of optimization and at a trio of techniques and technologies under development or new to the market.

No ‘Snowflakes’

For more than a decade, the oil and gas industry has worked in from the edges in its quest to solve the puzzle that is the developmental life cycle of a shale reservoir. Each stage in the cycle has been one of adversity, with the drilling and completion stages presenting a host of challenges in unlocking resources from reservoirs thousands of feet deep with long laterals. These challenges continue into the well’s production stage, accelerating the cycle of adaptation and experimentation as crafty production engineers and field service technicians look for solutions to stave off the dreaded decline curve.

“Our understanding of shale reservoirs has gone through the roof. In the early days, people were just starting to understand what shale is. Geologists knew the construct of shale, but to produce from it was a new phenomenon,” said Spandan Spandan, partner at McKinsey & Co., adding that the industry’s understanding of well design and construction has also come far.

“This essentially allowed us to convert wells from a snowflake (each well optimized for its own conditions) to something that could be mass-produced. The manufacturing era of wells was driven by the understanding of wells and by optimizing to the extreme,” he said.
The drive to capitalize on the manufacturing era encouraged shale operators to select equipment and services based primarily on price, forcing service companies to focus on developing low-cost technology options, according to a 2017 McKinsey report. That focus remains as the world recovers from the COVID-19 pandemic amid geopolitical turmoil that has elevated global demand and placed pressure on oil and gas supplies.

“We’ve seen a lot more volatility. The war in Ukraine has highlighted how the market is trying to balance affordability of energy, plus security of supply, and emissions. Historically, we had only two factors. Now we are trying to balance all three of them quite actively. And all of this is being done under the umbrella of the energy transition,” said Spandan.

New Commercial Model

One challenge facing the artificial-lift sector, particularly in shale, is the establishment of a commercial model that allows operators to invest in artificial lift without having to make a long-term commitment of capital, Spandan said.

“That is something that both the service company and the operator will need to collaborate on to come up with a commercial model that enables that. We’ve seen some movement within the industry to move the capex spend on artificial-lift systems into an opex model,” he said. “But the next step of evolution for the industry is to perhaps have production optimization as a service, one that bundles the sensors, the lift system, specialty chemicals, all of that as a service, so that the capex is converted into something that’s opex.”
- North America > United States (0.30)
- Europe > Ukraine (0.24)
- North America > United States > Wyoming > Niobrara Formation (0.99)
- North America > United States > Texas > Permian Basin > Yeso Formation (0.99)
- North America > United States > Texas > Permian Basin > Yates Formation (0.99)
- (25 more...)
Management

Unlike what happened during previous oil-price collapses, merger and acquisition (M&A) activity has been limited since prices started to fall in 2014. But the signs are that M&A activity may be building, and oil company management teams should think about which deal strategies to pursue. The oil-price trend has historically been one of the most important determinants of how value is created in the oil and gas industry, and some M&A strategies that worked in the rising-price environment of the past 15 years may not work in today’s market. This article examines the industry’s M&A performance across cycles back to 1986 and identifies strategies that could help companies create value, measured by total returns to shareholders, through the price trough.

Most commodities industries are prone to consolidation during the downside of the cycle, when supply surpluses accumulate, prices fall, and competition heats up. The oil and gas industry is no exception (Fig. 1). In the 1998–2000 price trough, more than 25 deals greater than USD 1 billion in value were executed in North America alone, including the BP-Amoco, Exxon-Mobil, and Chevron-Texaco megamergers. In total, this wave of dealmaking amounted to more than USD 350 billion in just over 2 years. It took another decade to match the same deal volume in North American exploration and production (E&P).

Oil prices are recovering from a 12-year low in January (below USD 30/bbl) but remain well below the levels of most of the past decade. There are signs of rising M&A activity, even though few deals have been executed so far. Over the past year, bid-ask spreads have been too wide for deals to proceed. However, this could change, given the increasing signs of vulnerability among weaker players in the market. First, industrywide leverage has risen significantly over the past 3 years, and it is particularly high for independent E&P companies with exposure to US shale production. This group’s leverage has spiked, with debt at nearly 10 times earnings before interest, taxes, depreciation, and amortization, indicating an increasing likelihood of restructuring for the most indebted players. Second, pricing hedges are beginning to come off. As a result, it is possible that oil and gas companies will become available at distressed prices, either because they are in Chapter 11 (continuing to operate while restructuring their debt) or because their market valuations will sink to such low levels that they become attractive acquisition candidates, even if the buyer has to reach an agreement with its bondholders as part of the deal.
Summary

Hydrocarbon (re-)development projects need to be evaluated under uncertainty. Forecasts of oil and gas production must capture the ranges of the many uncertain parameters and their impact on the outcome if the value of the project to the company is to be maximized. Several authors have shown, however, that the oil and gas industry struggles to adequately assess the distributions of hydrocarbon production forecasts.

With digitalization, the methods for forecasting hydrocarbon production have evolved from analytical solutions to numerical models with an ever-increasing number of gridblocks (“digital twins”), and on to ensembles of models that cover the uncertainty of the various parameters. Analytical solutions and single numerical models allow calculation of incremental production for a single case; however, they can determine neither the uncertainty of the forecast nor where within the distribution of possible outcomes the single model lies. Ensemble-based forecasts can address these questions, but they must cover a large number of uncertain parameters and cope with the amount of data that is generated accordingly.

Theory-guided data science (TGDS) approaches have recently been used to overcome these challenges. Such approaches use the scientific knowledge captured in numerical models to generate a data set large enough to apply data science methods. These approaches can be combined with economics to determine the desirability of a project for a company (expected utility). Quantitative decision analysis, including a value of information (VoI) calculation, can then be performed, addressing not only the uncertainty range but also the risk hurdles required by the decision-maker (DM).

The next step is the development of learning agent systems (an agent being an autonomous, goal-directed entity that observes and acts upon an environment) that can cope with the large amounts of data generated by sensors, use those data to condition models, and feed them into decision analysis. Companies need to address the challenges of data democratization, so that available data can be integrated and used, of organizational agility, and of developing data science skills, while making sure that the technical skills required for the TGDS approach are retained.
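The ensemble-to-VoI workflow described above can be sketched end to end. The following is a minimal, illustrative Python sketch, not the paper's method: the exponential-decline proxy, the price, cost, discount, and risk-tolerance figures, and the sampling choices are all assumptions for demonstration, and a perfect-information bound stands in for a full VoI calculation.

```python
import math
import random
import statistics

def production_forecast(qi, d, years=10):
    """Annual volumes (bbl) from a simple exponential-decline proxy."""
    return [qi * (1 - d) ** t for t in range(years)]

def npv(volumes, price=60.0, opex=15.0, capex=50e6, rate=0.10):
    """Net present value (USD) of one forecast realization."""
    return sum((price - opex) * q / (1 + rate) ** (t + 1)
               for t, q in enumerate(volumes)) - capex

def expected_utility(values, risk_tolerance=50e6):
    """Exponential utility encodes the DM's risk aversion."""
    return statistics.mean(1.0 - math.exp(-v / risk_tolerance) for v in values)

random.seed(7)
# Ensemble: each member samples an uncertain initial rate and decline fraction.
npvs = [npv(production_forecast(qi=random.lognormvariate(13.0, 0.4),
                                d=random.uniform(0.10, 0.35)))
        for _ in range(2000)]

# Without further information the DM makes one decision for all outcomes:
# develop if the expected NPV is positive, otherwise walk away.
value_without = max(statistics.mean(npvs), 0.0)

# With perfect information the DM develops only the favorable outcomes,
# so the expected value is the mean of max(NPV, 0) over the ensemble.
value_with = statistics.mean(max(v, 0.0) for v in npvs)

voi = value_with - value_without  # upper bound on what to pay for data
u = expected_utility(npvs)
print(f"E[NPV] = {statistics.mean(npvs) / 1e6:.1f} MM USD, "
      f"VoI (perfect) = {voi / 1e6:.1f} MM USD, E[utility] = {u:.2f}")
```

The perfect-information value always bounds the VoI of any real data-acquisition program from above; an imperfect survey or sensor would be valued the same way, with the ensemble partitioned by the possible measurement outcomes instead.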
- North America > United States > Texas (1.00)
- Europe (1.00)
- Asia > Middle East (0.94)
- North America > United States > California (0.93)
- South America > Argentina > Patagonia > Neuquén > Neuquen Basin > Vaca Muerta Shale Formation (0.99)
- North America > United States > Texas > Permian Basin > Yeso Formation (0.99)
- North America > United States > Texas > Permian Basin > Yates Formation (0.99)
- (27 more...)
The oil industry has been hurt by the fluctuating and declining commodity price trend. Many industry participants are preparing for a lower-for-longer price environment, with multiple delays and cancellations of greenfield project developments. Longer term, how will the industry energize deepwater developments to close the gap between cost and current commodity prices?

This manuscript provides basic background and sets the stage for a panel discussion scheduled for 2:00–4:30 p.m., Wednesday, May 4, 2016, at the conference. Panelists will discuss how breakeven deepwater oil costs have evolved to date and how they are managing development costs going forward. Some of the areas panelists will address are:

- Reducing supply chain costs
- Using standardized or proven project designs
- Improving upside potential by using third-party hosts for floating production systems
- Forming alliances among service providers to enhance supply chain savings
- Reconfiguring fiscal terms and local content requirements with host governments
- Africa (1.00)
- North America > United States > Texas (0.46)
- North America > United States > Gulf of Mexico > Central GOM > West Gulf Coast Tertiary Basin > Keathley Canyon > Block 919 > Lucius Field (0.89)
- North America > United States > Gulf of Mexico > Central GOM > West Gulf Coast Tertiary Basin > Keathley Canyon > Block 918 > Lucius Field (0.89)
- North America > United States > Gulf of Mexico > Central GOM > West Gulf Coast Tertiary Basin > Keathley Canyon > Block 875 > Lucius Field (0.89)
- (4 more...)
Management

The precipitous drop in oil prices is putting higher-cost plays such as deepwater under the microscope. Key questions include: How will the industry energize deepwater developments to close the gap between cost and current commodity prices? What oil price is required to keep deepwater viable over the long term? These topics will undoubtedly be top of mind for oil company executives in the months to come, and they were the focus of a panel discussion at the Offshore Technology Conference (OTC) titled “Energizing Worldwide Oil and Gas Developments.” Also participating was Kassia Yanosek of McKinsey & Company, which provided breakeven cost analyses and historical data on deepwater developments. By reflecting on the past and looking at the current situation, the panelists explored scenarios for how deepwater breakeven prices could be brought down significantly.

The Past and Present

In the run-up to the oil price drop in late 2014, deepwater saw spectacular investment and production growth. Even then, however, there were signs of increasingly challenging project economics. Global deepwater investment increased from USD 16 billion in 2003 to more than USD 70 billion in 2013, with production more than doubling over that period to almost 6 million B/D, or 7% of the world’s total oil supply. Toward the end of this period, there were buildups in costs and cycle times. In the USD 100/bbl price environment of 2012–2013, deepwater breakeven costs for greenfield projects ranged from USD 70/bbl in the US Gulf of Mexico (GOM) to USD 75/bbl in West Africa, 2 to 3 times the costs of the previous decade. This was due primarily to three factors: increased geologic complexity (for example, the Paleogene in the GOM and the presalt in Brazil and Angola), increased government take and local content requirements, and project cost escalations beyond supplier margin/commodity costs (such as increased design complexity).
In today’s “lower-for-longer” price environment, deepwater greenfield project economics are challenged, yielding dramatic cuts in investment. As of early 2016, approximately 35 billion BOE of deepwater reserves and approximately 6 million BOEPD of production had been deferred. The good news is that in today’s price environment, project costs have been reduced across the board, driven primarily by supply chain margin compression. In regions with competitive supply chains, such as the GOM, breakeven costs for greenfield developments have decreased 20% on average, to USD 50–60/bbl, and ultradeepwater rig day rates have fallen 40% from the first quarter of 2014 to the first quarter of 2016.
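The breakeven arithmetic behind figures like these can be made concrete with a short sketch. Every input below (the production profile, capex, opex, discount rate, and the 25% supply chain deflation) is a hypothetical assumption chosen only to show how cost compression moves a project's breakeven price; none of it is taken from the panel's analysis.

```python
def project_npv(price, capex, volumes, opex_per_bbl, discount=0.10):
    """Discounted cash flow (USD) for a stylized single-phase oil project."""
    return sum((price - opex_per_bbl) * q / (1 + discount) ** (t + 1)
               for t, q in enumerate(volumes)) - capex

def breakeven_price(capex, volumes, opex_per_bbl, lo=0.0, hi=500.0):
    """Bisect for the oil price at which the project NPV crosses zero."""
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if project_npv(mid, capex, volumes, opex_per_bbl) < 0.0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Stylized deepwater profile: 5-year plateau at 30 MMbbl/yr, then 12%/yr decline.
volumes = [30e6] * 5 + [30e6 * 0.88 ** t for t in range(1, 16)]

base = breakeven_price(capex=12e9, volumes=volumes, opex_per_bbl=12.0)
# Same project after an assumed 25% supply chain deflation on capex and opex.
lean = breakeven_price(capex=0.75 * 12e9, volumes=volumes,
                       opex_per_bbl=0.75 * 12.0)
print(f"breakeven: {base:.0f} -> {lean:.0f} USD/bbl")
```

Because NPV rises monotonically with price, bisection always converges here; the sketch shows how a uniform cut in capex and opex translates almost directly into a lower breakeven, which is the mechanism the panelists credit for the post-2014 improvement.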
- Africa (0.89)
- North America > United States (0.88)
- Management > Strategic Planning and Management > Project management (1.00)
- Management > Energy Economics > Market analysis /supply and demand forecasting/pricing (1.00)
- Management > Asset and Portfolio Management (1.00)
- Facilities Design, Construction and Operation > Offshore Facilities and Subsea Systems (1.00)