Harvesting vast amounts of data has long been identified as an enabler of operational performance. The measurement of key performance indicators is routine practice in well construction, but systematically analyzing performance statistically against a large data bank of offset wells is not common practice. Performing that statistical analysis in real time is rarer still. With the introduction of edge computing, in which devices capable of complex analytical functions are placed in physical proximity to sensors and operations, this practice can be realized. Two case studies are presented: rate of penetration (ROP) and amount of vibration per run.
Hypothesis testing is a statistical method in which a sampled dataset is compared against an idealized or status quo model. This model is built using many samples from a population. The characteristics of the population are then inferred from these samples. The model is built in centers where large amounts of data are available. These models are then transferred to an edge device in the field. The device collects real-time data and compares results to the status quo model. In the two cases presented, hypothesis testing was used to determine maximum and minimum levels of ROP and downhole vibration. This information is used to determine the effectiveness of new drilling practices, technologies, or methodologies. Because calculations are performed in real time, changes to drilling practices can be adopted quickly.
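To illustrate the workflow just described, the following minimal Python sketch assumes the status quo model can be reduced to an offset-well population mean and standard deviation of ROP, computed centrally and shipped to the edge device; a one-sample z-test then flags whether the current run's samples fall outside the population normal. The numeric values, variable names, and significance threshold are illustrative assumptions, not figures from the case studies.

```python
# Minimal sketch of the workflow described above: a "status quo" model built
# from offset-well ROP data is reduced to summary statistics centrally and
# shipped to the edge device, which tests real-time samples against it.
# All values and names here are illustrative, not taken from the paper.
import numpy as np
from scipy import stats

# Status quo model from the offset-well data bank (computed in the data center)
OFFSET_ROP_MEAN = 85.0   # ft/hr, assumed population mean
OFFSET_ROP_STD = 12.0    # ft/hr, assumed population standard deviation

def test_against_status_quo(realtime_rop, alpha=0.05):
    """One-sample z-test: does the current run's mean ROP differ from the population?"""
    samples = np.asarray(realtime_rop, dtype=float)
    n = samples.size
    z = (samples.mean() - OFFSET_ROP_MEAN) / (OFFSET_ROP_STD / np.sqrt(n))
    p_value = 2.0 * stats.norm.sf(abs(z))   # two-sided p-value
    return z, p_value, p_value < alpha      # True -> outside the status quo model

# Example: ROP samples streamed from the rig during the current run
current_run = [96.0, 101.5, 88.2, 110.3, 99.7, 104.1]
z, p, differs = test_against_status_quo(current_run)
print(f"z = {z:.2f}, p = {p:.4f}, outside status quo model: {differs}")
```

The same test applied to per-run vibration statistics would serve the downhole vibration case, with the population model again built from offset data and only the summary statistics transferred to the edge device.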
The ROP case was performed at a US operating unit; the vibration case was performed in a Middle East unit. The models showed what the improvement targets should be. It was revealing to find that wells thought to be poor performers were actually well within the population norm. Wells were also found that were thought to be good performers because new drilling methods were used, but that actually fell within the population model, suggesting that the new methods had not affected performance. By performing this analysis on the edge device, operations can make changes early, in such a way that results fall outside the status quo model and deliver real performance improvements.
The paper presents the novel use of statistical models calculated in data centers in conjunction with real-time operations. Similar approaches exist in technical and physics modeling, in which models are produced in the office and used in the field. However, building models for operations management from a large bank of offset data and performing analysis in the field with real-time data is not a common practice. This paper shows both the technology and the statistical methods that provide a valid scientific framework for operational performance improvement.
Gupta, Dhaval (Reliance Industries) | Paliwal, Aditya (Reliance Industries) | Jain, Himanshu (Reliance Industries) | Gupta, Vinay Kumar (Oil & Natural Gases) | Goyal, Atin (Reliance Industries) | Kumar, Sujeet (Reliance Industries) | Bose, Soham (Reliance Industries)
The Reliance Group is India's largest private-sector enterprise, with businesses across the energy and materials value chain. With over 24,000 employees and 100,000 contract workers, RIL believes that healthy employees and a healthy workplace are the strongest pillars for the sustainable growth of any organisation. The management believes that driving and sustaining behavioural change, by adopting best health practices at the workplace, creates a sense of ownership and brings about a hazard-free environment. To inculcate best practices in the field of occupational health and safety, an innovative idea was introduced, known as CASHe (Change Agents for Safety, Health and Workplace Environment). This innovative project, with a proactive, preventive approach built on the teamwork of medical, safety, environment, and technical professionals, has established a landmark in the field of occupational health, resulting in a hazard-free environment and a positive change in the work attitude of employees and contract workers from the grass roots to the top level. CASHe projects have led to the manufacturing divisions consuming 0.48 million GJ of renewable energy (wind and biomass), using 63,220 tonnes of recycled material, and recycling 55.05% of total water consumed. There has been a 2.5% increase in renewable-energy consumption, a 3.1% reduction in direct GHG emissions, a 4.5% reduction in air emissions (NOx), a 4.1% increase in materials recycled, and an 11.7% reduction in air emissions (SOx) (all compared with 2014-15), creating a milestone in 2015-16. CASHe has made employees realise that they are not just "employees" but doctors of change. It has set a benchmark for the HSE management system; it is time this model was used across companies to leverage their safety standards, enhance efficiency, and make the workplace hazard free. This paper studies how CASHe came into existence, its purpose, its implementation, and its results at 13 sites of Reliance Industries, with emphasis on the innovative projects that led to its success.
Bakulin, Andrey (Geophysics Technology, EXPEC Advanced Research Center, Saudi Aramco) | Jervis, Mike (Geophysics Technology, EXPEC Advanced Research Center, Saudi Aramco) | Colombo, Daniele (Geophysics Technology, EXPEC Advanced Research Center, Saudi Aramco) | Tsingas, Constantine (Geophysics Technology, EXPEC Advanced Research Center, Saudi Aramco) | Luo, Yi (Geophysics Technology, EXPEC Advanced Research Center, Saudi Aramco)
Surface geophysics has good coverage but is limited in vertical resolution and quality, especially in areas with complex overburden. To realize high-fidelity reservoir characterization and monitoring, we foresee the need to bring geophysics closer to the reservoir, so that geophysical measurements are more in line with the resolution obtainable from logs. Sensors, and probably sources, need to be deployed tens or hundreds of meters below the surface. We envision this happening for targeted applications in areas from 10 to 100 km².
Setting the stage
Reservoir engineers need geophysics to fill the gap in information between the wells to help with reservoir management and improve recovery. This translates into a need for ultra-high vertical resolution for reliable reservoir properties and monitoring. The realities of surface geophysics in the Middle East and many other areas with complex overburden are rather unappealing. Surface seismic on the Arabian peninsula can provide structural information with perhaps 50-100 ft vertical resolution. Mature areas are covered by multiple legacy surveys, each escalating in cost but, from an engineer's perspective, adding little value. Engineers build billion-cell reservoir simulation models with 25x25 m grids that are not routinely populated using seismic data. As geophysicists, we never stop trying to improve data fidelity and quality, and we have had some successes using higher channel counts and wide-azimuth acquisition, but far too often what we extract is only incrementally better than legacy data. A revolution is needed in geophysical data acquisition to achieve our goals. As we discovered in 4D trials with buried receivers (Bakulin et al., 2012), we have the beginnings of a solution. All of us are familiar with the concept of resolution and its associated trade-offs (Figure 1): to see the big picture, one needs to be high above the target; to see details, one needs to be close. We have largely concentrated on these two extremes for a very long time; we perfected surface geophysics to image large volumes at low resolution, and we use logging to see incredible detail but very locally. Yet developments in these two areas have not fulfilled the engineers' needs outlined earlier. We think this can be changed by bringing geophysics closer to the reservoir, i.e., by literally burying sensors and perhaps sources below near-surface complexities, in deeper boreholes and in producing boreholes.
This paper presents the impact of using statistical hypothesis testing and data analytics to evaluate the performance of a wet gas multiphase flow meter (WGMPFM) against a test separator reference measurement. This study is part of a field evaluation in a wet gas operational environment. The outcome determines whether to approve or reject the WGMPFM for permanent installations and well testing.
The methodology focuses on evaluating and ensuring the reliability of the test separator measurement, given that it is used as the reference against which the WGMPFM performance is compared. The procedure uses exploratory data analytics for raw-data consolidation, flow-stability analysis, and data validation. This includes quadratic-order regression curve fitting of the wellhead pressure vs. flow rate relationship. Furthermore, WGMPFM repeatability evaluation is explained for pseudostable flow in field conditions.
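A minimal sketch of the quadratic curve fit mentioned above, assuming the separator test points arrive as a small table of flow rate and wellhead pressure; the column names, the least-squares routine, and the residual-based flagging rule are assumptions chosen for illustration, not the paper's exact procedure.

```python
# Hedged sketch: fit the wellhead pressure vs. flow rate relationship with a
# second-order (quadratic) polynomial and flag test points whose residuals are
# inconsistent with the trend. Column names and thresholds are assumptions.
import numpy as np
import pandas as pd

def fit_whp_rate_quadratic(df, rate_col="gas_rate", whp_col="wellhead_pressure"):
    """Fit WHP = a*q**2 + b*q + c by least squares; return coefficients and residuals."""
    q = df[rate_col].to_numpy(dtype=float)
    p = df[whp_col].to_numpy(dtype=float)
    coeffs = np.polyfit(q, p, deg=2)            # quadratic fit
    residuals = p - np.polyval(coeffs, q)
    return coeffs, residuals

def flag_suspect_points(residuals, k=2.0):
    """Flag points whose residual exceeds k standard deviations (assumed rule)."""
    return np.abs(residuals) > k * residuals.std(ddof=1)

# Example with synthetic separator test data
tests = pd.DataFrame({"gas_rate": [10, 15, 20, 25, 30, 35],                   # MMscf/d
                      "wellhead_pressure": [950, 930, 905, 870, 832, 780]})   # psia
coeffs, res = fit_whp_rate_quadratic(tests)
print("fit coefficients (a, b, c):", coeffs)
print("suspect test points:", flag_suspect_points(res))
```

Test points that depart from the fitted trend would then be re-examined or excluded before being used as the reference against which the WGMPFM is judged.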
In this study, it is shown that, using only the raw data, the WGMPFM fails the field evaluation against the test separator. However, after employing data analytics to validate the test separator reference values and using statistical hypothesis testing to evaluate the repeatability of the WGMPFM, it actually performs within specifications.
This paper provides a robust methodology for multiphase flow meter performance evaluation in the field. The focus resides on measurement accuracy, reliability, and repeatability, through the use of exploratory data analytics and statistical hypothesis testing. The robustness and reliability of the reference measurement data in the field is critical, given that flow conditions and stability cannot be controlled as well as in laboratory flow-loop tests.
Test separators are the conventional measurement systems for well testing, for which consistent industry standards have been developed to ensure their reliability. However, the flow measurement accuracy and reliability of this equipment is highly sensitive to several factors, including well stability conditions, proper separation control, measurement-device calibration and factor determination, and human intervention.
Nevertheless, it is common that the validity of the test separator data is not questioned after it has been acquired for comparison purposes. This happens because it is assumed that the procedures followed in the field to ensure the stability and calibration of the reference measurement are comprehensive enough to make the process and metering instrumentation uncertainty negligible. However, in the field, a well cannot be considered a controllable environment in which the process is in full steady state. A well usually flows under pseudostable conditions, where the process actually fluctuates by several percentage points in temperature, pressure, and flow rates. Also, conventional test separators do not guarantee near-perfect separation and complete process stability inside the vessel. The meters are calibrated against water and adjusted through rounded factoring values, which leads to uncertainty propagation in the reference. Hence, a test separator cannot be deemed an unquestionable reference.
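To make the repeatability evaluation concrete, the following hedged sketch applies a chi-square test to repeated WGMPFM readings taken at nominally the same pseudostable conditions, checking whether their relative standard deviation is consistent with a stated repeatability specification. The 2% specification, the sample values, and the function names are assumptions, not figures from the study.

```python
# Hedged sketch of a repeatability check under pseudostable flow: test whether
# the spread of repeated WGMPFM readings exceeds a stated repeatability spec.
# The specification and sample values are assumptions, not from the paper.
import numpy as np
from scipy import stats

def repeatability_test(readings, spec_pct=2.0, alpha=0.05):
    """Chi-square test of H0: relative standard deviation <= spec_pct (%)."""
    x = np.asarray(readings, dtype=float)
    n = x.size
    rel_sd = 100.0 * x.std(ddof=1) / x.mean()        # observed repeatability, %
    chi2_stat = (n - 1) * (rel_sd / spec_pct) ** 2
    p_value = stats.chi2.sf(chi2_stat, df=n - 1)     # small p -> spec exceeded
    return rel_sd, p_value, p_value >= alpha         # True -> within specification

# Example: six repeat gas-rate readings (MMscf/d) from the WGMPFM at one test point
repeats = [24.8, 25.1, 24.9, 25.3, 25.0, 24.7]
rel_sd, p, ok = repeatability_test(repeats)
print(f"relative SD = {rel_sd:.2f}%, p = {p:.3f}, within repeatability spec: {ok}")
```

Framing repeatability as a hypothesis test, rather than a single pass/fail number, accounts for the natural fluctuation of a pseudostable well before the meter is declared out of specification.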
This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper SPE 171510, “Cluster Liquefied Natural Gas: New Paradigm for Small and Medium Liquefied-Natural-Gas Business,” by JungHan Lee, LNG Solutions, prepared for the 2014 SPE Asia Pacific Oil and Gas Conference and Exhibition, Adelaide, Australia, 14–16 October. The paper has not been peer reviewed.
Small- and medium-scale liquefied natural gas (LNG) is different from conventional LNG in trading distances, target markets, and application areas. Small- and medium-scale LNG may better coordinate needs between regional gas producers and consumers. Cluster LNG is a new concept of LNG technology suitable for emerging market environments. High performance of cluster LNG originates from higher liquefaction temperature and the adoption of efficient refrigerants for the temperature ranges. The inherent high performance of cluster LNG enables low capital expenditure (CAPEX) and low operational expenditure (OPEX).
Small-scale LNG has so far constituted only a minor portion of global LNG production. However, there are many necessities of regional energy infrastructures and technologies that are different from those of traditional LNG. In Southeast Asian countries, natural gas and LNG sometimes compete with diesel oil rather than large-scale pipeline gas or large-scale LNG. At the same time, there are many small scattered gas sources remaining to be monetized for domestic markets in the region. Nevertheless, these domestic gas fields have not been developed properly so far because of small units of production and, as a result, high cost per unit. In this regard, new technologies suitable for small- and medium-scale LNG development may be necessary.
LNG for global exports and LNG intended for regional demands may need different approaches. Typical large-scale LNG aims to export at a radius up to 10 000 km; on the other hand, small- and medium-scale LNG may need only a 500- to 2 000-km radius.
The Offshore Protocol of the Barcelona Convention for the Protection of the Mediterranean Sea entered into force in 2011. In addition, since 2007, the European Union (EU) has adopted many measures impacting the upstream Oil and Gas (O&G) industry. To list a few: REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals), the Marine Strategy Framework Directive (MSFD, 2008), the Offshore Safety Directive (2013), the Marine Spatial Planning Directive (MSP, 2014), etc. It also revised its Environmental Impact Assessment Directive (EIA, 2014) and is in the process of revising its Environmental Liability Directive (ELD).
This unremitting legislative process slowly but deeply impacts the O&G industry. At the forefront, what is at stake is access to new prospects and planned developments. But this also impacts operators of existing facilities, who have to adapt to new constraints. In many countries bordering the Mediterranean Sea, the administrative processes have changed, or need to change, dramatically in order to cope with the new requirements, which might increase bureaucracy or generate delays. But the changes also provide opportunities for the industry to improve its overall Health, Safety and Environment performance and to demonstrate its commitment to sustainable development in a manner that is safe, respectful of the environment, and mindful of the other legitimate users of the sea.
In seismic inversion, we tend to use the geometrical behavior of the wavefield (the kinematics), extracted from the data, to constrain the long-wavelength model components, and use the recorded reflections to invert for the short-wavelength features, in a process referred to as full waveform inversion (FWI). For such a recipe, single-frequency (the right frequency) data are capable of providing the ingredients for both model components. A frequency that provides model wavelengths (through the transmission components) low enough to update the background and high enough (through reflections) to map the scattering may render the other frequencies almost obsolete, especially when large-offset data are available to provide the transition from background to scattering components. Thus, I outline a scenario in which we acquire dedicated mono-frequency data, allowing for more time to inject more of that single-frequency energy at a reduced cost. The cost savings can be used to acquire larger offsets, which is important for constraining the background model. Combining this single-frequency data with a hierarchical scattering-angle filter strategy in FWI, and potentially reflection FWI, provides an opportunity to invert for complex models starting even with poor initial velocity models. The objective of this new paradigm is a high-resolution model of the Earth, replacing our focus on the image, which requires a band of frequencies.
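The trade-off described above can be made explicit with the standard diffraction-tomography relation between temporal frequency, scattering angle, and the model wavenumber recovered by the inversion; this is a textbook result invoked here for clarity, not a formula reproduced from the abstract.

```latex
% Model wavenumber |k| recovered from a single angular frequency \omega,
% local velocity v, and scattering (opening) angle \theta:
\begin{equation}
  |\mathbf{k}| \;=\; \frac{2\omega}{v}\,\cos\!\left(\frac{\theta}{2}\right)
\end{equation}
```

Large scattering angles (transmission along long offsets) drive |k| toward zero and constrain the long-wavelength background, while small angles (reflections) give |k| up to 2ω/v and map the short-wavelength scatterers, which is why a single well-chosen frequency recorded at large offsets can, in principle, cover both ends of the model spectrum.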
Small- and medium-scale LNG is different from conventional LNG in trading distances, target markets, and application areas. Small- and medium-scale LNG may liaise between the needs of regional gas producers and consumers. The restrictions and differences of regional energy tasks may call for different approaches in technology and business model. Cluster LNG is a new concept of LNG technology suitable for emerging market environments. The high performance of Cluster LNG originates from a higher liquefaction temperature and the adoption of efficient refrigerants for those temperature ranges. The inherent high performance of Cluster LNG enables low CAPEX and low OPEX. Despite an increased containment-system cost over the conventional one, the cost reduction in the Cluster LNG liquefaction system dominates the overall economics within a 3,000 to 5,000 km distance from production to consumer sites. The high performance of Cluster LNG brings competitiveness to the small- and medium-scale LNG business, requiring 75 to 85% of the CAPEX and 65 to 75% of the OPEX of a conventional plant. In addition, the CO2 emission per unit of LNG production for Cluster LNG is 60 to 70% of that of conventional LNG. Cluster LNG technology and business are well matched to the small- and medium-scale LNG business, offering cost-effectiveness and flexibility. After extensive engineering verifications and test plant operations, several commercial Cluster LNG projects are in negotiation stages. Cluster LNG technology also increases local content through wider choices of equipment vendors and system providers, together with smaller project budgets. The flexibility and cost competitiveness of Cluster LNG would bring a paradigm change to the small- and medium-scale LNG business.
Energy is a basic necessity, like agricultural products, for human beings and is indispensable for individual wellbeing. Meeting increasing electricity and energy demands for industries and households has become a national agenda item for some Southeast Asian countries. Among other energy resources, natural gas and LNG play important roles, given their relative abundance and environmental benefits. However, natural gas and LNG are difficult to move from production to consumer sites because of their large volumes and handling difficulties. These limitations have prevented small-quantity users in remote areas from taking advantage of this cost-effective energy.
Traditionally, LNG has been developed targeting large volumes of production and sales to achieve economies of scale. Consequently, small-scale LNG has been only a minor portion of global LNG production. However, there are many needs for regional energy infrastructures and technologies that are different from those of traditional LNG. In Southeast Asian countries, natural gas and LNG sometimes compete with diesel oil rather than with large-scale pipeline gas or LNG. At the same time, there are many small scattered gas sources to be monetized for domestic markets in the region.
LNG for global export and LNG for regional demand may need different approaches. Typical large-scale LNG aims to export within a radius of up to 10,000 km; small- and medium-scale LNG, on the other hand, may need only a 500 to 2,000 km radius. Refer to Figure 1.1 for the difference in radius between large-scale and small- and medium-scale LNG. Therefore, business models and technologies for global LNG and those for regional LNG should differ from each other.