ABSTRACT

This paper presents a numerical optimization approach for heat transfer design. A design procedure is presented for predicting heat transfer inside the pressure vessel of an ocean current turbine using finite element heat transfer analysis, artificial neural networks, and genetic algorithms. Numerical heat transfer analysis was performed with the commercial software ANSYS for two-dimensional heat transfer in simplified domains; computation was confined to heat conduction. The ANSYS simulation results were then used to train artificial neural networks (ANN) that approximate the unknown functional behavior of the heat transfer. The trained ANN serves as the nonlinear objective function of the optimization procedure, and genetic algorithms (GA) were employed as the optimization tool. The optimum results obtained from the GA were verified against both the ANSYS and ANN results. Both the ANN and GA were implemented in the MATLAB environment. The overall methodology was validated by satisfactory results for a specific ocean current turbine application.

INTRODUCTION

The 20 kW experimental ocean current research turbine developed by the Southeast National Marine Renewable Energy Center (SNMREC) at Florida Atlantic University (FAU) is designed to generate electricity from passing flow by a three-blade rotor connected to an induction motor/generator through a shaft supported by needle bearings and a planetary gear reduction box. Ocean turbines are a new technology in the alternative energy community, and the SNMREC is assisting commercial developers to test components and subsystems with this turbine (Driscoll et al., 2008). One of the important parts of the ocean turbine is the pressure vessel, inside which the electric motor, along with other electrical components and sensors for control and monitoring, is installed. When the electric motor runs at full load, it generates a significant amount of energy in the form of heat.
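The surrogate-plus-GA loop described in the abstract can be sketched in outline. The surrogate function and the two design variables below are illustrative stand-ins (the paper's actual objective is an ANN trained on ANSYS conduction results, implemented in MATLAB); a minimal pure-Python genetic algorithm minimizes the surrogate's predicted temperature.

```python
import random

# Hypothetical surrogate standing in for the trained ANN: maps two normalized
# design variables to a predicted peak temperature. In the paper this role is
# played by an ANN fitted to ANSYS heat conduction runs.
def surrogate_peak_temp(x, y):
    return (x - 0.3) ** 2 + (y - 0.7) ** 2 + 40.0

def genetic_algorithm(fitness, bounds, pop_size=40, generations=60, seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [(rng.uniform(lo, hi), rng.uniform(lo, hi)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(*p))    # minimize predicted temperature
        parents = pop[: pop_size // 2]         # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            w = rng.random()                   # arithmetic crossover
            child = [w * a[i] + (1 - w) * b[i] for i in range(2)]
            if rng.random() < 0.2:             # gaussian mutation, clipped to bounds
                j = rng.randrange(2)
                child[j] = min(hi, max(lo, child[j] + rng.gauss(0, 0.1)))
            children.append(tuple(child))
        pop = parents + children
    return min(pop, key=lambda p: fitness(*p))

best = genetic_algorithm(surrogate_peak_temp, (0.0, 1.0))
print(best, surrogate_peak_temp(*best))
```

Because the GA only ever calls the cheap surrogate, each evaluation costs microseconds rather than a full finite element run, which is the point of the ANN-as-objective construction.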
- Information Technology > Artificial Intelligence > Representation & Reasoning > Optimization (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (1.00)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Fuzzy Logic (0.41)
Abstract

Automatic history matching may be used to condition reservoir simulation models by including time-lapse seismic data. Stochastic optimization algorithms are used to search the parameter space thoroughly and so determine the best models, but these approaches can require many thousands of simulations for large-dimensional problems. Divide and conquer is an assisted history matching approach that enables deconvolution of the parameters so that they can be searched more efficiently, and it also leads to better uncertainty analysis. We present an application of this approach on the Nelson field. Nine years of production history data are used along with baseline and monitor surveys. Localised variations were made to permeability and net:gross. The reservoir was divided into separate parameter regions by combining experimental design and proxy model analysis: the former enabled insignificant parameters to be discarded, while the latter showed that each region could be treated as a separate history matching sub-problem. Each sub-problem was then solved simultaneously using an adapted stochastic genetic algorithm. The results showed that a forty-two-dimensional problem could be reduced to a combination of three 9D problems and a 3D problem due to the spatial deconvolution of parameters and misfits. An improved match was obtained for the production and seismic data. Compared to a full stochastic search of the parameter space, the number of models was several orders of magnitude smaller. Further, improved uncertainty analysis was made possible, resulting in better understanding of the future behaviour of the reservoir. An improved match to reservoir models leads to better confidence in their predictions, and thus they can be used more effectively in reservoir management. The method presented here retains the benefits of stochastic searching without the penalty of requiring an impractical number of simulations.
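The deconvolution idea above can be illustrated with a toy example: when the total misfit splits into independent regional terms, each region can be matched as its own sub-problem, and the combined answer equals the joint search at a fraction of the cost. The region targets and quadratic misfits below are hypothetical; the Nelson study varied permeability and net:gross, and used a stochastic GA rather than the exhaustive grid shown here.

```python
import itertools

# Hypothetical per-region misfit: each region has one parameter and one target.
def region_misfit(r, p):
    targets = [0.2, 0.5, 0.8]
    return (p - targets[r]) ** 2

# Total misfit is a sum over regions, so the parameters deconvolve spatially.
def total_misfit(params):
    return sum(region_misfit(r, p) for r, p in enumerate(params))

grid = [i / 10 for i in range(11)]  # coarse search grid on [0, 1]

# Joint search over all regions at once: 11**3 = 1331 candidate models.
joint_best = min(itertools.product(grid, repeat=3), key=total_misfit)

# Divide and conquer: 3 * 11 = 33 models, one independent sub-problem per region.
dc_best = tuple(min(grid, key=lambda p, r=r: region_misfit(r, p))
                for r in range(3))

print(joint_best, dc_best)
```

The two searches land on the same model, but the decomposed version evaluates forty times fewer candidates; in the field case the same effect took a 42D problem down to three 9D problems and one 3D problem.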
- North America > United States > Texas > Andrews County (0.61)
- North America > Canada > Alberta > Wetaskiwin County No. 10 (0.61)
- North America > Canada > Alberta > Ponoka County (0.61)
- Europe > United Kingdom > North Sea > Central North Sea (0.61)
- Geophysics > Time-Lapse Surveying > Time-Lapse Seismic Surveying (1.00)
- Geophysics > Seismic Surveying (1.00)
- Europe > United Kingdom > North Sea > Central North Sea > Central Graben > Block 22/7 > Nelson Field > Forties Formation (0.99)
- Europe > United Kingdom > North Sea > Central North Sea > Central Graben > Block 22/6a > Nelson Field > Forties Formation (0.99)
- Europe > United Kingdom > North Sea > Central North Sea > Central Graben > Block 22/12a > Nelson Field > Forties Formation (0.99)
Abstract

Steam injection is the primary technique used to produce heavy oil reservoirs. The technology, used either as continuous injection or cyclic steam, has been applied successfully in Chevron's San Joaquin Valley fields since the mid-1960s. In the Diatomite reservoirs, cyclic steam was introduced in the mid-1990s and has been used profitably ever since to increase recovery from these assets. In the early years, field practices solved the production and steam optimization problem manually on a daily basis while taking into consideration all the operational constraints. As the number of wells grew larger (>500), resolving issues manually became difficult. In 2003 a scheduling tool based on a Genetic Algorithm (GA) was successfully deployed and is still being used. As the field has matured, the wells' individual production curve behavior has been changing from an idealistic exponential decline to a more complex profile. These new conditions, together with other tool limitations, prompted the opportunity to develop a new approach that would increase the success of the scheduling tool. A new approach, analogous to the Type Curves Concept (TCC) for cycle production response, is introduced in this article. The methodology uses Neural Networks (NN) to identify different cycle production shape patterns. Data from 500+ wells and more than 12,000 cycles were used to identify the type curves, or shape patterns. A meticulous classification of raw data revealed fourteen patterns, which were reduced to four representative patterns for further analysis and modeling. Significant data processing, such as cycle identification, interpolation, and normalization for amplitude and cycle-time, as well as selection of input parameters, was performed in order to train the Neural Network. To complete the model, heuristic rules were inferred to optimize well candidate selection. The hybrid NN/rule-based application was coded, integrated with the scheduler, and is currently being field-tested. Preliminary results indicate an 80% success rate, while the business value analysis showed significant optimization potential. The paper outlines the methodology used in the study, including pattern identification, model training, and the rule-based system, and concludes with early deployment results. The hybrid system proved to provide better steam allocation and minimize production loss. Lastly, challenges and lessons learned during development and deployment are summarized.
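A minimal sketch of the normalization and pattern-matching step described above, assuming synthetic cycle data. Each cycle is resampled to a common cycle-time and rescaled to unit amplitude, then matched against type-curve shapes; nearest-centroid matching stands in here for the trained neural network, and both centroid shapes are invented for illustration.

```python
import math

def normalize_cycle(rates, n_points=20):
    """Resample a cycle's production rates to a fixed cycle-time by linear
    interpolation, then rescale to unit amplitude."""
    m = len(rates) - 1
    resampled = []
    for i in range(n_points):
        t = i * m / (n_points - 1)
        k = min(int(t), m - 1)
        frac = t - k
        resampled.append(rates[k] * (1 - frac) + rates[k + 1] * frac)
    peak = max(resampled)
    return [r / peak for r in resampled]

# Two hypothetical type-curve centroids: an exponential-like decline and a
# humped response (the study identified fourteen patterns, reduced to four).
PATTERNS = {
    "exponential": [math.exp(-3 * i / 19) for i in range(20)],
    "humped": [math.sin(math.pi * (i + 1) / 21) for i in range(20)],
}

def classify(rates):
    x = normalize_cycle(rates)
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(PATTERNS, key=lambda name: dist(PATTERNS[name]))

print(classify([100, 60, 35, 20, 12, 7]))  # a decline-shaped input cycle
```

The normalization makes cycles of different durations and magnitudes comparable, which is what lets thousands of raw cycles collapse onto a handful of shape patterns.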
- Reservoir Description and Dynamics > Improved and Enhanced Recovery > Thermal methods (1.00)
- Production and Well Operations > Well & Reservoir Surveillance and Monitoring (1.00)
- Data Science & Engineering Analytics > Information Management and Systems > Neural networks (1.00)
- Data Science & Engineering Analytics > Information Management and Systems > Artificial intelligence (1.00)