Fluid flow analytical models traditionally used in well test analysis do not always yield adequate results for reservoir characterization. In contrast, fractal formulations have proved to be a more suitable tool. Among the various formulations, the Metzler-Glockle-Nonnemacher (MGN) equation includes a fractional time derivative. In this paper, that equation was applied to analyze interference tests in actual field cases and was automatically matched to the observed pressures. Both the usual reservoir properties (porosity and permeability) and the fractal exponents were varied in the matching procedure.
Automatic matching of the variables involved resulted in pressure increments that compare fairly well with the observed field test data. The responses show high anisotropy in the rock properties.
The MGN equation implemented in this paper comprises fractal behavior in the spatial variables as well as in the time term. Thus, well tests recorded in highly heterogeneous media, not amenable to standard techniques, can be analyzed with such an analytical fractal approach.
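For reference, one commonly quoted form of the MGN fractional diffusion equation in radial fractal geometry is sketched below; notation varies between references, so the symbols here are illustrative rather than the authors' own:

```latex
\frac{\partial^{\gamma} p}{\partial t^{\gamma}}
  = \frac{\eta}{r^{\,d_f - 1}}\,
    \frac{\partial}{\partial r}\!\left( r^{\,d_f - \theta - 1}\,
    \frac{\partial p}{\partial r} \right)
```

Here $\gamma$ is the order of the fractional time derivative, $d_f$ the fractal dimension, $\theta$ the conductivity (anomalous diffusion) index, and $\eta$ a diffusivity constant; setting $\gamma = 1$, $d_f = 2$ and $\theta = 0$ recovers the classical radial diffusivity equation.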
The Bentayan field, located on the northeast flank of the South Sumatra basin, was delineated by surface geological studies in the late 1920s as a narrow, northwest-trending anticlinal structure bisected by numerous northeast-trending transbasinal normal faults. In 1932, a discovery well encountered productive upper Talang Akar sandstones, which yielded 800 BOPD of paraffinic-based crude with a pour point of 115 degrees F. Up to 1982, only 12 delineation wells had been drilled, identifying 12 potentially productive sandstone units within a structural area of 27 sq km. Further evaluation of eight additional new wells showed that this reservoir is a complex system.
In the Bentayan field, oil is accumulated in the Talang Akar depositional complex; the best overall reservoir quality is found in relatively thick, well-connected, fine- to coarse-grained, cross-bedded fluvial sandstones3).
The reservoir oil currently has a gravity of 17 °API and a viscosity of 82.2 cp1) at reservoir temperature. Based on these properties, the Bentayan field is considered a heavy oil reservoir, and so far only 14% of the Bentayan oil reserve has been produced. In the 80 years since discovery, the reservoir pressure of the Bentayan field has declined from 1696 psi to about 1035 psi, and oil production has also declined significantly. In order to improve oil production, water injection has been implemented in several wells throughout the field; however, oil production is still declining.
Steam huff-and-puff, intended to reduce the viscosity of the oil in the reservoir, did not work very well for improving oil production in this field. This could be caused by the high reservoir pressure and also by the depth of the reservoir. Since this is a mature field where all production wells have been producing with high water cut, all conventional efforts to improve oil production would be difficult to bring to success. Implementing chemical EOR, in this case, is the best way to increase oil production. However, the selected chemical must be able to lower the oil viscosity, lower the IFT and alter the oil wettability. For this reason, a special chemical treatment (SEMAR) was implemented in this field using the huff-and-puff technique. Since the reservoir is under a strong water drive, the injection of SEMAR can also improve the oil production of surrounding wells. These phenomena are very interesting to evaluate as a step toward full-field chemical EOR implementation.
The Bentayan oil reservoir is considered a heavy oil reservoir with high oil viscosity. Thermal injection methods for EOR are rather difficult to implement in this field because of the depth of the reservoir, the high reservoir pressure and the small reservoir size. The best alternative for EOR implementation in this field is surfactant injection. However, surfactant EOR commonly does not work in heavy oil and high-temperature reservoirs. That is why, for a long time, oil production from the Bentayan oil field has been difficult to improve, even using thermal methods.
Costello, Craig (Apache Corp.) | Sordyl, Peter (Apache North Sea Ltd.) | Hughes, Cledwyn Thomas (Apache North Sea Ltd.) | Figueroa, Martin Rafael (Schlumberger) | Balster, Emmanuel Philippe (Schlumberger Oilfield Services) | Brown, George
Permanent distributed temperature sensing (DTS) using fiber-optic technology provides measurements over the complete length of the fiber in the wellbore. The temperature profiles can be monitored at surface in real time, minimizing the need for production logs, preventing deferred production losses, decreasing well interventions, and reducing operating costs. This technology has been applied by Apache North Sea Ltd in the Forties field to monitor and optimize the performance of two wells producing by gas lift on the Delta platform and, at the same time, examine their completion integrity.
To accomplish these objectives, a hybrid fiber-optic electrical cable was installed in two Forties wells, allowing the continuous measurement of temperature and the acquisition of pressure data from a downhole gauge located below the deepest gas-injection point. The combined benefit of reducing the number of well interventions, thus eliminating the associated QHSE risks, and lowering operating costs made this well monitoring strategy the appropriate one for this mature field.
The analysis and interpretation of downhole pressure and DTS data provided rapid feedback to the platform production team regarding the status of the well, allowing a better and more informed decision-making process.
In this paper, we outline the deployment of the hybrid DTS system and describe the analysis performed in each of the two wells. Data handling, analysis, and interpretation are described as well as the methodology and workflow for well monitoring and optimization using permanently installed DTS.
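As an illustration of the kind of automated screening that permanently installed DTS data enables, the sketch below (hypothetical code, not the operator's actual workflow) locates a candidate gas-injection point as the depth of the strongest cold anomaly in a temperature trace, exploiting the local cooling that injected gas produces in the wellbore:

```python
import numpy as np

def find_injection_point(depth_m, temp_c, baseline_window=11):
    """Return the depth of the strongest local cold anomaly in a DTS trace.

    A smoothed baseline is estimated with a moving average; the candidate
    injection point is where the measured temperature falls furthest below
    that baseline (gas injection cools the wellbore locally).
    """
    kernel = np.ones(baseline_window) / baseline_window
    baseline = np.convolve(temp_c, kernel, mode="same")
    anomaly = temp_c - baseline            # negative where cooler than trend
    return depth_m[int(np.argmin(anomaly))]

# Synthetic trace: linear geothermal gradient with a cold spot at 1500 m
depth = np.linspace(0.0, 3000.0, 601)
temp = 15.0 + 0.03 * depth                 # ~30 C/km gradient
temp -= 4.0 * np.exp(-((depth - 1500.0) / 30.0) ** 2)  # injection cooling

print(find_injection_point(depth, temp))   # prints 1500.0 for this trace
```

In a real deployment the trace would come from the DTS interrogator rather than a synthetic model, and the anomaly logic would need tuning for flowing-well temperature profiles.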
The Forties field is the largest oil field in the United Kingdom Continental Shelf (UKCS) and is located 180 km east-northeast of Aberdeen, mostly in UK Licence Block 21/10 (Fig. 1). The Forties is a mature field that has produced hydrocarbons since September 1975. The field was originally developed by BP but was taken over by Apache North Sea Ltd in 2003. Plateau production was at 500,000 B/D in 1979; production had fallen to about 40,000 B/D by 2003. As a result of the aggressive infill drilling campaign undertaken by Apache since 2003, current production is about 60,000 B/D. After an intensive reevaluation of the field, the company has doubled the proved reserves, thereby extending the field's life by at least 20 years. There are five production platforms with 85 wells online. Forty percent of the existing wells are producing with the assistance of deep gas lift, and the remaining 60% are producing with electrical submersible pumps.
The field occupies an area of approximately 90 km2. The reservoir interval occurs in thick Late Paleocene sandstones deposited in sand-rich submarine fan sequences. The general depositional current runs from northwest to southeast and controls the orientation of many features such as channel complexes. The reservoir in general is characterized by good permeability and high net-to-gross with good lateral and vertical connectivity. The reservoir fluid is fairly light with a gravity of about 35° API.
Prior to the broad implementation of multiphase flow meters (MPFMs) in Saudi Arabian offshore oil fields, rigorous testing was performed using in-series MPFM systems in conjunction with Saudi Aramco's testing fleet, which is equipped with conventional separator test traps. Sets of data were collected simultaneously from different wells, completed in different reservoirs, through different MPFMs for about a year to compare and validate the results. The accuracy of the MPFM proved to be within an engineering-acceptable margin of error for all parameters and in most cases matched the results of conventional testing methods. Since then, offshore platforms have been retrofitted with MPFMs so that all the wells on a platform can be tested by selectively switching them, one at a time, through a test line, either physically or remotely.
Since its pioneering implementation almost a decade ago, MPFM testing accuracy and the underlying technology have improved considerably. The use of the MPFM has many advantages in testing operations, especially during periods of high demand. Accurate and frequent well testing becomes decisive in times of maximum production rates, since well test results make it possible to determine, in real time, which wells are in significant decline and which have increasing water cut, especially in fields with a large number of wells. The quick identification of these problems leads to immediate action to restore the wells' productivity and maintain optimal production rates. The MPFM offers real-time well performance monitoring through the Supervisory Control and Data Acquisition (SCADA) system and has the added benefit of shorter test rate stabilization times. Additionally, the recent introduction of a remotely operated selector switch in the units has allowed full automation of the process of remotely switching different wells for testing on a multi-well platform. It also minimizes human involvement, provides operational flexibility, and reduces the waiting time for well switching caused by natural limiting factors, such as bad weather offshore.
This paper largely addresses the reliability and accuracy of MPFMs as compared to a conventional separator and to a portable MPFM, and uses an electrical submersible pump (ESP) optimization application to identify which method gives more accurate test results. Further optimization of ESP well performance through frequent testing is also accomplished, which in turn improves the sweep efficiency of the reservoir, accelerates the production of recoverable reserves and helps improve pump run life. The paper will also elaborate on the benefits attained from installing MPFMs in mature offshore oil fields, with a focus on special cases such as MPFM in-series testing, testing of artificially lifted wells, smart well completions and newly completed wells, exhibiting indirect support in achieving production targets. The different principles and theories behind MPFMs will be highlighted, along with their advantages and a vision of the way forward. Further to the benefits of installing MPFMs, the paper will also discuss lessons learned and the improved guidelines imposed on wellhead sampling (WHS) that benefit from the success of the MPFM. These guidelines effectively minimize sampling in wells equipped with MPFMs.
Every oil company strives to produce its fields smartly, effectively and efficiently to ensure maximum recovery and minimize unrecovered reserves. As the largest integrated oil company with the most reserves, Saudi Aramco is at the forefront of such efforts, utilizing real-time data and controls, also referred to in our industry as smart or intelligent wells and controls. Some of these efforts are evident from the recent development of the Haradh III increment (2006), part of the Ghawar field, which is considered to be the world's first fully intelligent field1. Even prior to that, individual intelligent field ventures started in many fields, and Safaniya, the world's largest offshore oil field, is no exception. Safaniya holds a wide range of intelligent field equipment and is expected to be fully automated in the near future. Completing wells with smart completion jewelry is one such effort. With an increasing number of smart well completions in Safaniya, the need is to ensure that the maximum benefit is gained from them and that the completion design is optimized early on.
This paper is an effort to detail the procedure adopted to conduct the first test on smart well completions in one of Saudi Arabia's offshore fields. It describes the processes and practices of testing smart well completions for the inflow performance and capacity of all laterals and for the operation of the downhole valves. To evaluate the completion practices in horizontal multilateral wells, followed by simulation runs with various completion configurations, the test was run on two different wells with two different completion configurations. This practice helped optimize the completion design (tubing size, number of multilaterals) of dual-lateral wells with inflow control valves (ICVs) to maximize production.
BP in Norway has been at the forefront of developing intelligent energy solutions since 1999, with the installation of its first fiber-optic communications link to the Valhall and Ula fields in the North Sea. Throughout the past decade BP Norway has built up considerable experience with Field of the Future technology, ranging from the first advanced collaboration environments for both drilling and operations to the world's first "Life of Field Seismic" array on the Valhall field, as well as advanced remote monitoring tools.
In 2005 it was decided to go ahead with two major projects: a new field center for the Valhall field, and a greenfield development for the Skarv field based on an FPSO. This presented a fantastic opportunity to build on BP Norway's existing experience and create two truly second-generation fields of the future. This paper will examine how this was achieved and how the project teams and contractors were engaged, via a set of automation and remote performance blueprints, to deliver this achievement.
As these fields are commissioned and come on stream, they will present a major change in the maintenance and operational mindset of the onshore support organization in adopting the extensive remote monitoring and optimization tools available to them. New advanced collaborative environments have been provided not only at the main office but also at the new support base and offshore. The Valhall design also includes an onshore control room with full remote control capability.
Clearly these two fields represent a sustainable approach to Intelligent Energy: a world-leading example of delivering value by retrofitting digital oilfield capabilities to existing assets, then incorporating all the experience and learning into a second phase of greenfield projects.
The Norwegian Sector of the North Sea is a mature province where BP Norge, on behalf of its partners, operates three field centers: the Valhall hub, consisting of the Valhall and Hod fields (Partner: Hess Norge AS); the Ula hub, consisting of the Ula and Tambar fields (Partner: Dong Norge AS); and lastly BP Norge's new Skarv field (Partners: Statoil ASA, E.ON Ruhrgas Norge AS and PGNiG Norway AS). BP Norway's main offices and support organization are located in Stavanger on the southwest coast of Norway, with a new support base for the Skarv operations in Sandnessjøen in mid-Norway.
In this paper the authors examine how BP Norge has progressed its Intelligent Energy solutions, known internally as the Field of the Future concept, over the last 12 years, achieving a high degree of sustainability. The authors briefly describe the establishment of BP's first fiber-optic communications link to a major offshore installation in 1999 and the adoption of advanced collaborative environments (ACE) and remote performance and condition monitoring technology in the existing brownfield facilities. They then go on to describe how BP Norge has managed to integrate these concepts into two new major projects, namely a new production and accommodation platform for Valhall and a new FPSO vessel for the new Skarv field. The authors examine how the initial lack of experience in building and incorporating these technologies was overcome. They also examine the organizational challenges involved in the introduction of such large-scale use of these concepts, even within an organizational unit relatively familiar with this type of technology, albeit at a smaller scale, on its brownfield installations.
Well-based Surrogate Reservoir Model (SRM) may be classified as a new technology for building proxy models that represent large, complex numerical reservoir simulation models. The well-based SRM has several advantages over traditional proxy models such as response surfaces or reduced models: (1) developing an SRM does not require approximating the existing simulation model, (2) the number of simulation runs required for the development of an SRM is at least an order of magnitude smaller than for traditional proxy models, and (3) beyond representing the pressure and production profiles at each well individually, the SRM can replicate, with high accuracy, the pressure and saturation changes at each grid block.
Well-based SRM is based on the pattern recognition capabilities of artificial intelligence and data mining (AI&DM), also referred to as predictive analytics. During the development process the SRM is trained to learn the principles of fluid flow through porous media as applied to the complexities of the reservoir being modeled. The numerical reservoir simulation model is used for two purposes: (1) to teach the SRM the physics of fluid flow through porous media as applied to the specific reservoir that is being modeled, and (2) to teach the SRM the complexities of the heterogeneous reservoir represented by the geological model and its impact on fluid production and pressure changes in the reservoir.
The application of well-based SRM to two offshore fields in Saudi Arabia is demonstrated. The simulation models of these fields include millions of grid blocks and tens of production and injection wells. Four producing layers in these assets contribute to production. In this paper we provide the details involved in the development of the SRM and show the results of matching the production from all the wells. We also present the validation of the SRM through matching the results of blind simulation runs.
The steps in the development of the SRM include the design of the required simulation runs (usually fewer than 20 runs are sufficient), identification of the key performance indicators that control pressure and production in the model, identification of the input parameters for the SRM, training and calibration of the SRM, and finally validation of the SRM using blind simulation runs.
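The design-runs, train, and blind-validation loop described above can be sketched in a deliberately simplified form. In this toy example (hypothetical code: a polynomial least-squares fit stands in for the AI&DM-based SRM, and an analytic function stands in for the numerical simulator) the proxy is calibrated on fewer than 20 "runs" and then checked against runs it never saw:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for one numerical simulation run: maps two input
# parameters (say, permeability and porosity multipliers) to a response.
def simulate(x):
    return 3.0 * x[:, 0] + 0.5 * x[:, 1] ** 2 - x[:, 0] * x[:, 1]

def features(x):
    # Quadratic feature expansion used by this toy proxy.
    return np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                            x[:, 0] ** 2, x[:, 1] ** 2, x[:, 0] * x[:, 1]])

# Step 1: design a small set of runs (fewer than 20, as noted above).
X_train = rng.uniform(0.0, 1.0, size=(15, 2))
y_train = simulate(X_train)

# Step 2: train (calibrate) the proxy on those runs.
coef, *_ = np.linalg.lstsq(features(X_train), y_train, rcond=None)

# Step 3: validate against blind runs the proxy never saw.
X_blind = rng.uniform(0.0, 1.0, size=(5, 2))
blind_error = np.max(np.abs(features(X_blind) @ coef - simulate(X_blind)))
print(blind_error)  # essentially zero for this toy problem
```

A real SRM replaces the least-squares fit with trained neural networks and validates against full simulator output at every well and grid block, but the workflow skeleton is the same.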
Steam injection is the primary technique used to produce heavy oil reservoirs. The technology, applied either as continuous injection or cyclic steam, has been used successfully in Chevron's San Joaquin Valley fields since the mid-1960s. In the Diatomite reservoirs, cyclic steam was introduced in the mid-1990s and has been used profitably ever since to increase recovery from these assets.
In the early days, field staff manually solved the production and steam optimization problem on a daily basis while taking into consideration all the operational constraints. As the number of wells grew (>500), manually resolving issues became difficult. In 2003 a scheduling tool based on a Genetic Algorithm (GA) was successfully deployed and is still in use. As the field has matured, the behavior of individual well production curves has been changing from an idealized exponential decline to a more complex profile. These new conditions, together with other tool limitations, prompted the development of a new approach to increase the success of the scheduling tool.
A new approach, analogous to the Type Curves Concept (TCC), for cycle production response is introduced in this article. The methodology uses Neural Networks (NN) to identify different cycle production shape patterns. Data from more than 500 wells and more than 12,000 cycles were used to identify the type curves, or shape patterns. A meticulous classification of the raw data revealed fourteen patterns, which were reduced to four representative patterns for further analysis and modeling. Significant data processing, such as cycle identification, interpolation, and normalization for amplitude and cycle time, as well as selection of input parameters, was performed in order to train the Neural Network. To complete the model, heuristic rules were inferred to optimize well candidate selection. The hybrid NN/rule-based application was coded, integrated with the scheduler, and is currently being field-tested. Preliminary results indicate an 80% success rate, while the business value analysis showed significant optimization potential.
The paper outlines the methodology used in the study, including pattern identification, model training and the rule-based system, and concludes with early deployment results. The hybrid system proved to provide better steam allocation and to minimize production loss. Lastly, challenges and lessons learned during development and deployment are summarized.
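The amplitude and cycle-time normalization step mentioned above can be illustrated with a short sketch (hypothetical code, not the authors' implementation): each cycle is resampled onto a fixed-length grid and scaled by its peak rate, so cycles of different durations and magnitudes become comparable shape patterns for the neural network.

```python
import numpy as np

def normalize_cycle(t_days, q_bopd, n_points=50):
    """Resample a production cycle to fixed length and unit amplitude.

    Time is rescaled to [0, 1] over the cycle duration, the rate is
    interpolated onto a uniform grid, and the result is scaled by the
    peak rate, so only the shape of the cycle response remains.
    """
    t = np.asarray(t_days, float) - t_days[0]
    t /= t[-1]                               # cycle-time normalization
    grid = np.linspace(0.0, 1.0, n_points)
    q = np.interp(grid, t, q_bopd)           # uniform resampling
    return q / q.max()                       # amplitude normalization

# Two hypothetical cycles with different durations and peak rates
short = normalize_cycle([0, 10, 30, 60], [120.0, 90.0, 40.0, 15.0])
long_ = normalize_cycle([0, 40, 120, 240], [60.0, 45.0, 20.0, 7.5])
print(np.max(np.abs(short - long_)))  # 0.0: identical shape after scaling
```

After this preprocessing, a classifier only has to distinguish shapes, which is what allows thousands of raw cycles to collapse into a handful of representative patterns.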
Technology over the last decade has revolutionized information systems and how data are collected and shared in our daily lives. An extraordinary amount of data is being collected, stored, analyzed, and shared every day. The oil and gas industry is no stranger to data collection and analysis, but data acquisition and management has historically been complex and expensive and, thus, slowly implemented. With recent advances in hardware and software, costs have plummeted to yield rapid implementation of sophisticated monitoring, analysis, and easy sharing of information in oilfield projects.
The thermal-enhanced oil recovery (TEOR) process requires constant monitoring and analytical management of wells and facilities. This data management process requires large amounts of data to be collected, correlated and acted upon on a very frequent basis, resulting in easily recognizable business benefits for oil producers. This paper reviews the use of wireless instrumentation and integrated data collection systems for digitally managing oilfields, and reviews one such implementation.
An independent oil producer using wireless instrumentation and state-of-the-art data integration technology has been operating the largest WirelessHART network in the United States since the beginning of 2010. This producer wirelessly monitors all of its steam injectors in three thermal oil fields located in western Kern County, California.
Information Sharing Boom
Information sharing technology has boomed over the last 20 years, with tremendous growth over the last 10. The Internet has become the primary infrastructure for information sharing. Internet usage has increased dramatically over the last 20 years; census and World Bank statistics show that in 1990 less than 1% of the United States population used the Internet, a number which skyrocketed to 80% by 2010. Broadband Internet in the United States has become ubiquitous, and cellular Internet access is now available throughout most of the country. Cell phone usage has grown exponentially over the last decade; as of 2012, an estimated fifty percent of cell phone users have smart phones. The continually growing infrastructure of both broadband Internet and cellular data networks is providing easy access to the Internet, and thereby to information, from anywhere in the United States. This increased coverage and bandwidth has likewise increased access to information, data generation, and the growth of integration technology. Data generation has exploded in recent years: in 2005 the world generated 150 exabytes of data, it is estimated that over 1,200 exabytes were generated in 2010, and we are on pace to create 35,000 exabytes in the year 2020. One exabyte of data could house the entire Library of Congress 100,000 times over.
The immense scale of data generation and consumption we see today would not be possible without Internet-based startups creating new technologies and setting precedents for storing and distributing data to consumers. Online services like Facebook, Twitter, and Skype have revolutionized communication between people via the Internet. Companies like Google and Amazon have likewise revolutionized methods of storing, accessing, and analyzing data. Today, data generation and information sharing are no longer the challenge; the challenge is how to effectively provide the information a person needs, where and when they need it. The same technologies that saw enormous growth in the social media world translate easily to the world of business, and it is our responsibility to learn to fully utilize them to modernize the thermal EOR industry.
Gas or steam gravity drainage is a very efficient recovery mechanism. Field observations, laboratory studies and pore network modeling have pointed towards very low residual oil saturations and high recovery with gravity drainage. While work during the last three decades has focused on understanding the physics of three-phase flow, literature on field-scale gravity drainage production decline characteristics is somewhat limited. Understanding field-level production decline characteristics is important, as it enables better production forecasting, resource estimation and reservoir management.
In this paper, production data from two fields under gravity drainage are evaluated and show strong exponential decline characteristics. After an initial period of rapid decline, there is a long period of lower decline. The paper relates field-scale observations of gravity drainage to laboratory observations of gravity drainage in long cores. The laboratory experiments show similar behavior, where a period of initial rapid decline is followed by a long period of lower decline, resulting in very low remaining oil saturation in the gas-invaded zone. The paper discusses the physics of gravity drainage displacement and summarizes similarities and differences between field and laboratory observations.
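The exponential decline signature described above can be checked numerically: on a semilog rate-time plot, exponential decline is a straight line, so the decline rate D in q(t) = q_i exp(-D t) can be recovered by linear regression on log(q). The sketch below (hypothetical code, with synthetic two-stage data standing in for real field rates) fits the rapid early stage and the gentle late stage separately:

```python
import numpy as np

def fit_exponential_decline(t_years, q_rate):
    """Fit q(t) = q_i * exp(-D t) by linear regression on log(q).

    Returns (q_i, D); a straight line on a semilog rate-time plot is
    the signature of exponential decline.
    """
    slope, intercept = np.polyfit(t_years, np.log(q_rate), 1)
    return np.exp(intercept), -slope

# Synthetic two-stage decline: rapid early decline, then a long gentle tail
t_early = np.linspace(0.0, 2.0, 20)
t_late = np.linspace(2.0, 20.0, 80)
q_early = 1000.0 * np.exp(-0.60 * t_early)
q_late = q_early[-1] * np.exp(-0.05 * (t_late - 2.0))

qi_e, D_e = fit_exponential_decline(t_early, q_early)
qi_l, D_l = fit_exponential_decline(t_late, q_late)
print(D_e, D_l)  # recovers D of about 0.60 and 0.05 per year
```

With real production data the break point between the two stages would itself have to be estimated, for example by finding where the residual of a single-stage fit grows.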