ABSTRACT The industry is facing significant challenges due to the recent downturn in oil prices, particularly for the development of tight reservoirs. It is more critical than ever to 1) identify sweet spots with less uncertainty and 2) optimize completion-design parameters. The overall objective of this study is to quantify and compare the effects of reservoir quality and completion intensity on well productivity. We developed a supervised fuzzy clustering (SFC) algorithm to rank reservoir quality and completion intensity and to analyze their relative impacts on well productivity. We collected reservoir properties and completion-design parameters for 1,784 horizontal oil and gas wells completed in the Western Canadian Sedimentary Basin. Then, we used SFC to classify 1) reservoir quality, represented by porosity, hydrocarbon saturation, net pay thickness, and initial reservoir pressure; and 2) completion-design intensity, represented by proppant concentration, number of stages, and injected water volume per stage. Finally, we investigated the relative impacts of reservoir quality and completion intensity on well productivity in terms of first-year cumulative barrels of oil equivalent (BOE). The results show that in low-quality reservoirs, well productivity follows reservoir quality. In high-quality reservoirs, however, the role of completion design becomes significant, and productivity can be impaired by an inefficient completion design. The results suggest that in low-quality reservoirs productivity can be enhanced with a less intense completion design, while in high-quality reservoirs a more intense completion significantly enhances productivity.
Keywords Reservoir quality; completion intensity; supervised fuzzy clustering; approximate reasoning; tight reservoir development
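The paper's supervised variant is not reproduced here, but the clustering step it builds on can be sketched with standard fuzzy c-means (Bezdek). The snippet below is a minimal Python illustration, assuming four standardized reservoir-quality features (porosity, hydrocarbon saturation, net pay, initial pressure) and three quality tiers; the synthetic data, tier count, and ranking rule are hypothetical, not taken from the study.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, tol=1e-5, max_iter=300, seed=0):
    """Standard fuzzy c-means: returns cluster centers and soft memberships U."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)              # memberships sum to 1 per well
    for _ in range(max_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)                      # guard against zero distances
        inv = d ** (-2.0 / (m - 1.0))              # Bezdek membership update
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            return centers, U_new
        U = U_new
    return centers, U

# Hypothetical standardized features: porosity, So, net pay, initial pressure
rng = np.random.default_rng(1)
X = rng.normal(size=(1784, 4))                     # stand-in for the well data
centers, U = fuzzy_c_means(X, c=3)

# Rank clusters from low to high quality by the mean of their centers,
# then assign each well the tier of its highest-membership cluster
order = np.argsort(centers.mean(axis=1))
tier = order.argsort()[U.argmax(axis=1)]           # 0 = lowest, 2 = highest quality
```

In the paper's supervised formulation, known productivity outcomes guide the clustering; the unsupervised ranking above is only a scaffold for the idea.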
Over recent decades, sophisticated industrial control systems (ICS) have benefited offshore vessels, but the benefits have come at the expense of operational simplicity and transparency. Continuing questions about how system integrity is best managed raise important issues about the nature, measurement, management, and perhaps even basic concepts of system integrity in the maritime industry. Traditional definitions, concepts, and measurement tools do not adequately support the specialized integrity management needs of the maritime industry. The industry needs practical tools that are readily understandable, supported by a meaningful measurement technique, minimally disruptive to the current data and integrity management workflow, and sufficiently useful to endure and to evolve the industry's knowledge about integrity in complex systems. This paper explores a structure for a metric supported by a process to systematically evolve the maritime industry's knowledge about integrity in complex systems. The structure introduces a vital concept for system integrity management instantiated as a "corruption vector" that has dimensions useful for gauging the relative impact of integrity-reducing practices and incidents, while supporting the determination and mitigation of the root causes of those incidents. This paper also contemplates the implementation of an analytical method based on (a) commonly accepted and well-understood software engineering models; (b) commonly collected information for describing the dimensions of a corruption vector; and (c) a transparent and practical decision support tool for quantifying, gauging, trending, and ultimately managing ICS integrity. New integrity management concepts, tools, and processes are needed to clarify the nature of ICS integrity and provide actionable guidance to system managers. Utilization of a structure that embraces the concept of defining, gauging, and eliminating corruption vectors from complex systems is a leap forward in ICS integrity management.
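As a rough illustration of what a corruption vector could look like in code, the sketch below models one integrity-reducing incident with four hypothetical dimensions and a weighted magnitude score. The dimension names, scales, and weights are assumptions for illustration only, not definitions from the paper.

```python
from dataclasses import dataclass

@dataclass
class CorruptionVector:
    """One integrity-reducing incident or practice, scored on 0..1 dimensions.
    Dimension names and weights are illustrative assumptions."""
    severity: float       # impact on ICS behaviour if the corruption is acted on
    extent: float         # fraction of system components touched
    persistence: float    # how long the corruption survives the data workflow
    detectability: float  # 1.0 = hard to detect, 0.0 = immediately visible

    def magnitude(self, weights=(0.4, 0.2, 0.2, 0.2)) -> float:
        dims = (self.severity, self.extent, self.persistence, self.detectability)
        return sum(w * d for w, d in zip(weights, dims))

# Gauging and trending: the mean magnitude per reporting period flags drift
incidents = [CorruptionVector(0.7, 0.1, 0.5, 0.8),
             CorruptionVector(0.2, 0.4, 0.1, 0.3)]
period_score = sum(cv.magnitude() for cv in incidents) / len(incidents)
print(f"mean corruption magnitude this period: {period_score:.2f}")
```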
Summary In studies on risk and uncertainty analysis, practical applications for strategizing the execution of exploitation plans are limited. In the standard risk analysis, the critical probability values are P10, P50, and P90 of the OOIP, Reserves, etc. (called objective variables). The choice of the critical probability values, P10 and P90, is arbitrary and in general overestimates or underestimates the risk and the opportunity size. The present paper identifies the critical probability values based upon the range of influence of parameters such as area, thickness, etc. (called intrinsic parameters) on the opportunity. Opportunity is the chance to obtain more than the proven value of the objective variables (OOIP, Reserves, etc.) under consideration.
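For context, the standard workflow the paper critiques can be sketched in a few lines of Python: sample the intrinsic parameters, compute the objective variable (here volumetric OOIP), and read off percentiles. All distributions below are hypothetical, and the exceedance convention for P10/P90 is an assumption.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical distributions for the intrinsic parameters
area = rng.triangular(800, 1200, 2000, n)               # drainage area, acres
h    = rng.triangular(10, 25, 60, n)                    # net pay thickness, ft
phi  = np.clip(rng.normal(0.18, 0.02, n), 0.05, 0.30)   # porosity, fraction
sw   = np.clip(rng.normal(0.30, 0.05, n), 0.10, 0.60)   # water saturation
bo   = 1.2                                              # formation volume factor, rb/stb

# Volumetric OOIP in stock-tank barrels (7,758 bbl per acre-ft)
ooip = 7758.0 * area * h * phi * (1.0 - sw) / bo

# Exceedance convention: P90 is the value exceeded with 90% probability
p90 = np.percentile(ooip, 10)
p50 = np.percentile(ooip, 50)
p10 = np.percentile(ooip, 90)
print(f"P90={p90:.3e}  P50={p50:.3e}  P10={p10:.3e} STB")
```

The paper's argument is that anchoring decisions to the 10th and 90th percentiles is arbitrary; the critical probability values should instead follow from how strongly each intrinsic parameter influences the opportunity.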
Abstract We applied Experimental Design methodologies to evaluate development options in a Siberian field with major uncertainties in the geological, reservoir, and financial parameters. Our approach, used for the first time in Russia, demonstrates that an inverted 7-spot injection pattern is expected to yield 11% more oil than the line drive identified by conventional evaluation methods. Of more significance for decision making, the analysis predicts that the expected net present value (NPV) of this pattern is 46% more than that of the pattern proposed previously. The primary reservoirs consist of stacked fluvial, delta distributary channel, and mouth bar sands. Although exploration tests confirmed oil rates exceeding 120 tonne/day, the location and orientation of the individual sand bodies could not be mapped from existing seismic. Recognising this as a risk to project success, we developed an Experimental Design technique for quantifying the uncertainties in oil rate predictions. The major stages were:
1) Identification of the 20 geological and reservoir parameters that influence the simulation production forecasts, for example the dimensions of the sand bodies, residual oil saturation, and facies distributions.
2) Determination of which parameters have a major impact on oil rate predictions. In principle, this requires simulating all combinations of possible values for the 20 parameters; in practice this is not realistic, as hundreds of thousands of dynamic models would be needed. With the Plackett-Burman Experimental Design, we needed only 48 reservoir descriptions, each consistent with the available field data, to identify the 11 significant subsurface parameters.
3) Creation of a second-order polynomial equation that expressed the net present value (NPV) as a function of the 15 most significant reservoir and financial parameters. This equation, intended to reproduce the response of the dynamic simulator and the financial model, gave a probabilistic distribution of NPVs for each development pattern. For the 6 patterns under consideration, we created the equation with a face-centred composite Experimental Design technique, which required a total of 378 simulation cases.
Results from these cases were obtained in 3 days with the use of parallel-processing technology that allowed multiple models to be submitted simultaneously. Our experience shows that Experimental Design techniques integrated with advanced computer technology can cut evaluation times from months to weeks. More importantly, they give unbiased probabilities of production profiles and NPV so that downside risks can be quantified.
This paper was prepared for presentation at the 1998 SPE Asia Pacific Conference on Integrated Modeling for Asset Management held in Kuala Lumpur, Malaysia, 23-24 March 1998.
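The screening stage described in this abstract can be illustrated with the classic 12-run Plackett-Burman design; the paper's 48-run design for 20 parameters follows the same principle at a larger size. The generator row below is the published one for N = 12, while the factor range and response values are placeholders.

```python
import numpy as np

def plackett_burman_12():
    """12-run Plackett-Burman design for up to 11 two-level factors,
    built by cyclically shifting the published N=12 generator row."""
    gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
    rows = [np.roll(gen, k) for k in range(11)]
    rows.append(-np.ones(11, dtype=int))           # final run: all factors low
    return np.array(rows)

design = plackett_burman_12()                      # coded levels, -1 / +1

# Map one coded column onto a physical range (hypothetical sand-body width, m)
lo, hi = 50.0, 400.0
width = lo + (design[:, 0] + 1) / 2 * (hi - lo)

# Placeholder responses standing in for simulated oil rates, one per run
rng = np.random.default_rng(7)
y = rng.normal(1000.0, 150.0, size=len(design))

# Main-effect estimate per factor: mean(high runs) minus mean(low runs)
effects = 2.0 * design.T @ y / len(design)
ranking = np.argsort(-np.abs(effects))             # most influential factors first
```

Because the coded columns are orthogonal and balanced, each main effect reduces to the simple contrast above, which is why a handful of runs can screen many factors at once.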