A multi-phase stimulation treatment was designed and executed in the deep-water Gulf of Mexico to remediate multiple damage mechanisms that had accumulated over years of hydrocarbon production. Among the many challenges deep-water operators face is the remediation of wells experiencing declining production. These treatments can be very costly and require extensive damage assessments to design the most effective stimulation plan. Treatment placement is a major part of the decision process and directly affects job performance. A well in the Mississippi Canyon field showed an asphaltene deposition problem, identified through asphaltene onset pressure evaluations, as well as suspected fines migration, each requiring its own treatment protocol. The operation required moving a rig onto location so the job could be pumped via coiled tubing to ensure injectivity into the zone of interest.
The multi-phase design had to weigh the use of xylene alone for organic deposit removal against specialty solvent treatments formulated specifically for asphaltene removal, combined with deep-penetrating hydrofluoric acid blends and specialty additive packages.
This multi-phase approach produced a successful treatment outcome for the operator: an increase in total fluids production, an increase in flowing tubing pressure, and a job payout of less than 30 days.
In 1859, Col. Edwin Drake drilled and completed the first known oil well near a small town in Pennsylvania, U.S.A. Today, more than 9,000 offshore platforms are in service worldwide, operating in water depths ranging from 10 ft to greater than 5,000 ft, with topside payloads of 5 to 50,000 tons, producing oil, gas, or both. A vast array of production systems is available (see Figure 1), ranging from fixed platforms to compliant, floating, and subsea systems.
The majority of offshore fields have been developed with conventional fixed steel platforms. A defining feature of a fixed steel structure is that it is essentially "fixed": it acts as a cantilever anchored at the seabed. This keeps its natural period below that of the damaging significant wave energy, which lies in the 8- to 20-second band. As water depth increases, these structures become more flexible, and the natural period lengthens toward that of the waves. The structure then becomes dynamically responsive, and fatigue becomes a paramount design consideration.
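The trend described above can be illustrated with a single-degree-of-freedom idealization of the structure, where the natural period is T = 2π√(m/k). The stiffness and mass values below are hypothetical round numbers chosen only to show the effect of reduced stiffness in deeper water; they are not design figures.

```python
import math

def natural_period(mass_kg: float, stiffness_n_per_m: float) -> float:
    """Natural period T = 2*pi*sqrt(m/k) of a single-degree-of-freedom oscillator."""
    return 2.0 * math.pi * math.sqrt(mass_kg / stiffness_n_per_m)

# Hypothetical numbers: same mass, very different lateral stiffness.
mass = 2.0e7  # kg, assumed combined topside and structural mass

# Stiff shallow-water jacket: short period, well below the 8-20 s wave band.
print(round(natural_period(mass, 5.0e8), 2))  # -> 1.26 (seconds)

# Flexible deep-water structure: period approaches the wave band.
print(round(natural_period(mass, 2.0e6), 2))  # -> 19.87 (seconds)
```

The second case lands inside the 8- to 20-second band of significant wave energy, which is exactly the dynamically responsive regime the text describes.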
Figure 1.6--The Baldpate Compliant Tower is one of the tallest free-standing structures in the world, shown with the Empire State Building for comparison (web photograph, Amerada Hess Corp., New York City).
Figure 1.9a--Worldwide fleet of installed and sanctioned semisubmersible FPSs (courtesy of BP).
Figure 1.9c--Worldwide fleet of installed and sanctioned spars (courtesy of BP).
Figure 1.10--Semisubmersible FPS planned for the Thunder Horse field (courtesy of BP).
Figure 1.11--Alternative proven-technology field development options (courtesy of BP).
Figure 1.12--Subsea production trees used in conjunction with a fixed jacket structure (Intec Engineering, Houston).
Digital transformation across the geosciences and engineering domains is driven by the need to share data of increasing complexity and quantity as part of a distributed workload involving operators, partners, service companies, and consultants. Unconventional plays pose an additional challenge, since large numbers of wells must be planned and then drilled within days. Companies operating multiple rigs, fracturing crews, and workover teams face a significant, time-constrained workload that requires rigorous and efficient execution.
Ideally, broad collaboration among all these actors would have every user operating the same software suite, with the ability to periodically synchronize data repositories or work concurrently in a common cloud environment. In reality, different companies, and different departments within a company, use different software systems. The push to apply more analytics, new modeling technologies, or broader multi-disciplinary workflows to unconventional reservoirs, with software tools often sourced from new third-party providers, adds the burden of more complex data flows and the concomitant data verification required of data consumers.
File formats exist to transfer specific data objects between software platforms: well data (e.g., LAS, DLIS), grid data, and so on. These formats package the information for a single data instance, such as one well. With large unconventional resource play projects drilling hundreds of wells a year while managing production from hundreds more, individually writing and reading files for every new or updated well and sharing the data across multiple systems is labor-intensive and prone to errors or omissions. This puts the burden on the recipient to verify data correctness and completeness.
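To make the single-instance nature of these formats concrete, the sketch below extracts curve mnemonics from the ~Curve section of a LAS 2.0 well log file using only the standard library. The sample content and the `curve_names` helper are invented for illustration; production code would normally use a dedicated reader such as lasio.

```python
import io

# A minimal, invented LAS 2.0 snippet for one well (illustration only).
SAMPLE_LAS = """~Version
VERS.   2.0 : CWLS LAS 2.0
~Curve
DEPT.M      : Measured depth
GR  .GAPI   : Gamma ray
RHOB.G/C3   : Bulk density
~ASCII
100.0  55.2  2.45
100.5  60.1  2.47
"""

def curve_names(las_text: str) -> list:
    """Collect curve mnemonics from the ~Curve section of a LAS 2.0 file."""
    names, in_curves = [], False
    for line in io.StringIO(las_text):
        line = line.strip()
        if line.startswith("~"):
            # Section headers start with '~'; only ~C(urve) lines interest us.
            in_curves = line.upper().startswith("~C")
            continue
        if in_curves and line and not line.startswith("#"):
            # Mnemonic is everything before the first '.' on the line.
            names.append(line.split(".", 1)[0].strip())
    return names

print(curve_names(SAMPLE_LAS))  # -> ['DEPT', 'GR', 'RHOB']
```

One such file describes exactly one well, which is why sharing hundreds of wells this way multiplies the read/write and verification workload.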
An industry consortium has developed and published a data exchange standard that packages all the data pertaining to a hydrocarbon reservoir system, including subsurface measurements, interpretations, drilling, completion, and production data, such that a complete project can be exported from one system and read into another in a single pass. The process supports a full accounting of data items and their metadata. This drastically reduces the workload at both ends of the transaction while offering significantly better assurance that the data set is complete and correctly referenced.
This has been proven in a full-scale live demonstration involving six different software systems, each executing a step in a reservoir model enrichment workflow and writing out the data ready for the next system to ingest. The workflow steps and read-write operations took 45 minutes in total, significantly less than a traditional file-based process would require, and with considerably less risk of errors or omissions.
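The abstract does not name the standard, but exchange standards of this kind (e.g., Energistics RESQML) typically bundle many data parts plus metadata into one zip-based container, so a whole project can be enumerated in a single pass. The package contents below are invented to illustrate that idea; the part names are not from any real specification.

```python
import io
import zipfile

# Build a toy zip-based exchange package in memory (contents are hypothetical).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as pkg:
    pkg.writestr("[Content_Types].xml", "<Types/>")      # package-level metadata
    pkg.writestr("wells/well_001.xml", "<Well uuid='...'/>")
    pkg.writestr("grids/grid_A.xml", "<Grid uuid='...'/>")

def list_parts(package_bytes: bytes) -> list:
    """Enumerate every data part in a zip-based exchange package in one pass."""
    with zipfile.ZipFile(io.BytesIO(package_bytes)) as pkg:
        return sorted(pkg.namelist())

print(list_parts(buf.getvalue()))
# -> ['[Content_Types].xml', 'grids/grid_A.xml', 'wells/well_001.xml']
```

Because every part is enumerable from the container manifest, the receiving system can verify completeness mechanically instead of reconciling hundreds of individual files.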
Zhu, Jingyi (Southwest Petroleum University / The University of Texas at Austin) | Yang, Zhaozhong (Southwest Petroleum University) | Li, Xiaogang (Southwest Petroleum University) | Li, Yang (Southwest Petroleum University)
To investigate the feasibility and advantages of microwave heating for coalbed methane (CBM) development, three microwave heating parameters were varied: heating time, output power, and water content. First, the temperature of coal samples under the microwave field was recorded. Second, the increases in porosity and permeability of the coals before and after microwave treatment were measured by the gas displacement method. The nitrogen adsorption/desorption method and the Frenkel-Halsey-Hill model were then used to characterize the pore structure and fractal dimension of the coal samples, respectively. Finally, methane adsorption capacity at different temperatures and water contents in coal seams was simulated. Within 30 seconds of microwave irradiation, the surface temperature of the coal samples rose significantly. Fractures formed on the surface of the coal cores due to thermal stress associated with the higher-temperature regions. The degree of porosity and permeability enhancement correlated with microwave heating time and output power. High water content under microwave irradiation even led to the collapse of coal cores. Since methane is stored mainly in the micropores of the coal matrix, the pore structure of the coals deserves particular attention. After microwave treatment, the average pore diameter and cumulative pore volume of the coal samples increased significantly. The fractal dimension, which represents the complexity of the pore structure, also increased, a result of internal pressure generated by steam. Simulation results showed that methane adsorption capacity in coals decreases with increasing temperature and water content. Consequently, such a large temperature rise over a short time markedly accelerates methane desorption. This study shows that microwave heating has broad prospects in the development of unconventional resources.
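The Frenkel-Halsey-Hill (FHH) analysis mentioned above is commonly performed by fitting ln(V) against ln(ln(p0/p)) over part of the nitrogen isotherm and taking D = 3 + A, where A is the slope (other conventions, such as D = 3A + 3, exist under capillary-condensation assumptions). The isotherm points below are hypothetical, chosen only to show the fitting step; they are not data from this study.

```python
import math

def fhh_fractal_dimension(rel_pressures, volumes):
    """Estimate the FHH fractal dimension D = 3 + A, where A is the
    least-squares slope of ln(V) versus ln(ln(p0/p))."""
    xs = [math.log(math.log(1.0 / p)) for p in rel_pressures]  # p given as p/p0
    ys = [math.log(v) for v in volumes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return 3.0 + slope

# Hypothetical isotherm: (relative pressure p/p0, adsorbed volume in cm3/g)
p = [0.40, 0.55, 0.70, 0.85]
v = [2.1, 2.6, 3.3, 4.5]
print(round(fhh_fractal_dimension(p, v), 2))  # -> 2.56
```

Values of D between 2 (smooth surface) and 3 (highly irregular surface) are the physically meaningful range, so an increase in D after microwave treatment indicates a more complex pore surface.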
Oil production from shale reservoirs has increased dramatically in recent years. To identify drilling targets and optimize well completions, early access to reservoir fluid properties is important. However, because of the low permeability of shale reservoirs, fluid samples often become available only after the most important development decisions have been made. Acquiring fluid property data earlier has therefore been an abiding challenge for shale development.
Mud logging gas data acquired while drilling provide the earliest hydrocarbon response from the reservoir. In an earlier study, we demonstrated that advanced mud gas data have large potential to predict reservoir fluid properties. In general, fluid properties correlate strongly with the thermal maturity of the source rock. In shale reservoirs, the fluids are still in the source rock, as low permeability limits migration and convection. As a result, shale reservoir fluid systems are relatively undisturbed and highly consistent, which makes it possible to correlate advanced mud logging gas data with reservoir fluid properties.
Based on a reservoir fluid database with more than 60 samples from different shale reservoirs, we developed a machine learning algorithm to predict fluid properties from advanced mud logging gas data. The accuracy of the new method is significantly improved over the previous model, which used an explicit correlation based on gas wetness. In addition, the new approach is more general and does not depend on a specific shale reservoir. We applied the new model to 11 wells with advanced mud logging gas data; the predicted gas-oil ratios are close to measurements from early production data when the gas data are of good quality.
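The exact wetness correlation used in the earlier model is not given here, but gas wetness in mud gas interpretation is conventionally the fraction of the C2-C5 components in the total C1-C5 gas show, expressed in percent. The sketch below uses invented gas-show values purely to illustrate the ratio.

```python
def gas_wetness(c1, c2, c3, c4, c5):
    """Mud gas wetness ratio Wh = 100 * (C2+C3+C4+C5) / (C1+...+C5), in percent.
    A classic compositional indicator: low values suggest dry gas, higher
    values suggest progressively oilier fluids."""
    heavy = c2 + c3 + c4 + c5
    total = c1 + heavy
    return 100.0 * heavy / total

# Hypothetical gas shows (units cancel, e.g. ppm):
print(round(gas_wetness(9000, 500, 200, 100, 50), 2))    # dry-ish gas  -> 8.63
print(round(gas_wetness(6000, 1500, 900, 500, 300), 2))  # wetter gas   -> 34.78
```

A machine learning model replaces this single explicit ratio with a multivariate mapping from the full advanced gas composition to fluid properties such as gas-oil ratio, which is why it generalizes better across reservoirs.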
This publication demonstrates that advanced mud logging gas data can be used to acquire reservoir fluid properties in shale reservoirs. The approach provides a novel and cost-efficient solution to early-phase sampling challenges. In addition, the method provides continuous fluid data along the entire well, as opposed to a single fluid sample taken at one location, giving insight into the fluid distribution in shale reservoirs. The method can be widely used for sweet-spot identification and for optimizing fracturing strategy in shale reservoirs.
Pineda, Wilson (BP) | Wadsworth, Jennifer (BP) | Halverson, Dann (BP) | Mathers, Genevive (BP) | Cedillo, Gerardo (BP) | Maeso, Carlos (Schlumberger) | Maggs, David (Schlumberger) | Watcharophat, Hathairat (Schlumberger) | Xu, Weixin (Wayne) (Schlumberger)
Deepwater depositional environments in the Gulf of Mexico can be very complex, and accurate determination of depositional facies is important in these capital-intensive fields. The most common reservoir facies are laterally extensive sheet sandstones with thin mudrock layers, channel complexes (isolated or amalgamated), and channel-levee complexes (often with poor reservoir communication). Reservoirs are often further complicated by steep dips close to salt domes and by potential fluid conduits along faults or fractures. Borehole images help determine the character of the sediments, improve net sand calculations, illuminate both the structure and the depositional environment in the near-wellbore region, and provide valuable geomechanical information for determining the stress vector.
A well was recently drilled through one of these deep-water sediment sequences in the Gulf of Mexico with an oil-based mud (OBM) system. An extensive acquisition program included a series of logging-while-drilling (LWD) and wireline images. In addition to the current lower-resolution LWD borehole imaging tools, a new LWD dual-physics OBM imager (DPOI) was deployed for the first time in this field. Images based on five different measurement physics were acquired: lower-resolution images from nuclear measurements (gamma ray, density, and photoelectric factor) and high-resolution images from the DPOI, which is based on resistivity and ultrasonic measurements. Wireline high-resolution OBM resistivity images were also acquired. This paper compares images collected with the new DPOI against traditional LWD images and high-resolution wireline resistivity images.
The features observed with the various imaging tools were compared, showing how differences in physics, resolution, and time of logging affect the images, and what impact these factors can have on subsequent interpretations. Four main categories of features are included in the comparisons: sand-rich sections, consistently dipping mudrocks, chaotic zones, and fractures/faults. The different images together allow a fuller interpretation of the gross sequence. In general, the higher the resolution, the more detailed and confident the interpretation, particularly where hole conditions are good. In degraded borehole sections, LWD acquisition was beneficial because images could be obtained as early as possible, when damage was at a minimum. The impact of the differences in physics depends on the properties and contrasts being imaged. This is evident with fractures: both conductive and resistive examples can be seen on both LWD and wireline images. The ultrasonic images are complementary, with both low- and high-amplitude fractures visible, providing more confidence in the fracture interpretation.
Time-lapse seismic monitoring is a powerful technique for reservoir management and the optimization of hydrocarbon recovery. In time-lapse seismic datasets, the differences in seismic properties across vintages enable the detection of production-induced spatio-temporal changes in saturation and structure. The main objectives are (1) to identify bypassed pay zones in time-lapse seismic data for the deepwater Amberjack field in the Gulf of Mexico, (2) to confirm the identified bypassed pay zones against reservoir simulation results, and (3) to recommend well planning strategies to exploit these bypassed resources.
A high-fidelity seismic-to-simulation 4D workflow incorporating seismic, petrophysics, petrophysical property modeling, and reservoir simulation was employed, leveraging cross-discipline interaction, interpretation, and integration to extend asset management capabilities. The workflow addresses geology (well log interpretation and framework development), geophysics (seismic interpretation, velocity modeling, and seismic inversion), and petrophysical property modeling (earth models and co-located co-simulation of petrophysical properties with P-impedance from seismic inversion). An embedded petro-elastic model (PEM) in the reservoir simulator is then used to combine spatial dry-rock properties with saturation properties to compute dynamic elastic properties, which can be related to multi-vintage P-impedance from time-lapse seismic inversion. Where the requisite dry-rock properties for the PEM are unavailable, a small data engine determines them using metaheuristic optimization. Specifically, two particle swarm optimization (PSO) applications, one incorporating an exterior penalty function (EPF), are modified into nested and average methods, respectively. These methods simultaneously calculate the missing rock parameters (dry-rock bulk modulus, shear modulus, and density) needed for dynamic, embedded P-impedance calculation in the history-constrained reservoir simulation. A graphics-enabled method was then devised to select the threshold that discriminates non-reservoir (including bypassed pay) from reservoir in the P-impedance difference, and its results were compared with unsupervised learning (k-means and hierarchical clustering). Bypassed pay locations identified from the seismic data are confirmed by reservoir simulation after a seismic-driven history match. Finally, infill wells are planned and modeled in the reservoir simulator.
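The internals of the PEM are not given in the abstract, but petro-elastic models of this kind commonly rely on Gassmann's equation to relate the dry-rock bulk modulus to the saturated bulk modulus used in impedance calculations. The sketch below shows that standard relation with hypothetical sandstone properties; the numbers are illustrative, not from this study.

```python
def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Saturated bulk modulus via Gassmann's equation:

        K_sat = K_dry + (1 - K_dry/K_min)**2
                        / (phi/K_fl + (1 - phi)/K_min - K_dry/K_min**2)

    k_dry: dry-rock bulk modulus, k_min: mineral (grain) bulk modulus,
    k_fl: pore-fluid bulk modulus, phi: porosity (fraction). All moduli
    must share the same units (e.g. GPa)."""
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + num / den

# Hypothetical brine-saturated sandstone: K_dry=12 GPa, quartz K_min=37 GPa,
# brine K_fl=2.5 GPa, porosity 25%.
print(round(gassmann_ksat(12.0, 37.0, 2.5, 0.25), 2))  # -> 16.09 (GPa)
```

In a workflow like the one described, the simulator evaluates this relation cell by cell with time-varying fluid moduli, so a change in saturation maps directly into a change in P-impedance that can be compared against the time-lapse seismic inversion.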