Steam generation for the purposes of thermal recovery includes facilities to treat the water (produced water or fresh water), generate the steam, and transport it to the injection wells. A steamflood uses high-quality steam injected into an oil reservoir. Steam quality is defined as the weight percent of vapor-phase steam relative to the total weight of the steam (vapor plus liquid). The higher the steam quality, the more heat the steam carries. High-quality steam provides heat to reduce oil viscosity, which mobilizes and sweeps the crude to the producing wells.
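As a minimal illustration of the definition above, the Python sketch below computes steam quality as the vapor-to-total mass ratio and the corresponding wet-steam enthalpy h = h_f + x·h_fg. The function names and the property values h_f and h_fg are assumptions for this sketch (rounded saturated-steam properties near 1,000 psia), not figures taken from the cited source.

```python
def steam_quality(mass_vapor_lbm, mass_liquid_lbm):
    """Steam quality: weight fraction of vapor in the total (vapor + liquid) stream."""
    return mass_vapor_lbm / (mass_vapor_lbm + mass_liquid_lbm)

def wet_steam_enthalpy(quality, h_f, h_fg):
    """Enthalpy of wet steam: saturated-liquid enthalpy plus quality times latent heat."""
    return h_f + quality * h_fg

# Illustrative values only (approximate saturated-steam properties near 1,000 psia):
h_f, h_fg = 542.0, 650.0  # Btu/lbm, assumed for this sketch
x = steam_quality(mass_vapor_lbm=80.0, mass_liquid_lbm=20.0)  # 80% quality
print(f"quality = {x:.2f}, enthalpy ~ {wet_steam_enthalpy(x, h_f, h_fg):.0f} Btu/lbm")
```

The sketch makes the stated trend explicit: raising the quality x raises the enthalpy delivered per pound of injected steam.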
The relationship between seismic velocities and mechanical properties is a strong one. Moduli, such as bulk modulus (and its inverse, compressibility), rigidity (or shear modulus), and Young's modulus, can be determined either from static (very slow) experiments or dynamic experiments, involving the passage of a seismic wave through the sample. Eqs. 1 and 2 are correct only for isotropic media and are strictly appropriate only for moduli measured at the same frequency and amplitude as the seismic wave. Investigators often ignore these distinctions and use the seismically determined moduli to approximate the static moduli sought by reservoir or completions engineers for compaction drive estimates or hydraulic-fracture design. When properly calibrated, the spatial or temporal variations in velocity-derived moduli can often be used to indicate changes in static moduli.[1]
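Eqs. 1 and 2 are not reproduced in this excerpt. The sketch below uses the standard isotropic relations between density, P- and S-wave velocities, and the dynamic moduli, which is presumably what the source refers to; all numeric inputs are hypothetical and chosen only to exercise the formulas.

```python
def dynamic_moduli(rho_kg_m3, vp_m_s, vs_m_s):
    """Standard isotropic relations between density, velocities, and dynamic moduli (Pa)."""
    mu = rho_kg_m3 * vs_m_s**2                                 # shear modulus (rigidity)
    k = rho_kg_m3 * (vp_m_s**2 - (4.0 / 3.0) * vs_m_s**2)      # bulk modulus
    e = 9.0 * k * mu / (3.0 * k + mu)                          # Young's modulus
    nu = (vp_m_s**2 - 2.0 * vs_m_s**2) / (2.0 * (vp_m_s**2 - vs_m_s**2))  # Poisson's ratio
    return {"shear_Pa": mu, "bulk_Pa": k, "young_Pa": e, "poisson": nu}

# Hypothetical sandstone-like values, for illustration only:
print(dynamic_moduli(rho_kg_m3=2400.0, vp_m_s=3500.0, vs_m_s=2000.0))
```

These are dynamic (seismic-frequency) moduli; as the paragraph notes, they should be calibrated before being used in place of the static moduli needed for compaction or hydraulic-fracture work.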
Dynamic data is information that changes asynchronously as it is updated. Unlike static data, which is infrequently accessed and unlikely to be modified, or streaming data, which has a constant flow of information, dynamic data involves updates that may arrive at any time, with sporadic periods of inactivity in between.[1] In the context of reservoir engineering, dynamic data is used during the creation of a reservoir model in conjunction with historical static data. When modeled accurately, any sampling from the conditional distribution would produce accurate static and dynamic characteristics. When a permanence-of-ratios hypothesis is employed, the conditional probability P(A|B,C) can be expressed in terms of P(A), P(A|B), and P(A|C).
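The short sketch below shows one common form of the permanence-of-ratios combination (after Journel's formulation), in which the "odds against" each probability, (1 − p)/p, are assumed to combine multiplicatively. The exact notation in the source may differ, and the input probabilities here are purely hypothetical.

```python
def odds_against(p):
    """'Distance' used in the permanence-of-ratios formulation: (1 - p) / p."""
    return (1.0 - p) / p

def combine_permanence_of_ratios(p_a, p_a_given_b, p_a_given_c):
    """Approximate P(A|B,C) from P(A), P(A|B), and P(A|C) under the
    permanence-of-ratios hypothesis: x / b = c / a, hence x = b * c / a."""
    a = odds_against(p_a)          # prior odds against A
    b = odds_against(p_a_given_b)  # odds against A given B
    c = odds_against(p_a_given_c)  # odds against A given C
    x = b * c / a                  # odds against A given both B and C
    return 1.0 / (1.0 + x)

# Hypothetical probabilities, purely to show the mechanics:
print(combine_permanence_of_ratios(p_a=0.2, p_a_given_b=0.4, p_a_given_c=0.5))
```

In this example both data sources B and C individually raise the probability of A, and the combined estimate rises further, as the hypothesis intends.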
DNA fingerprinting (also called DNA typing) involves isolating and creating images of DNA (deoxyribonucleic acid) sequences. In the context of reservoir characterization, DNA samples from microorganisms found living in crude oil are examined.[1] The process for creating a DNA fingerprint consists of first obtaining a sample of cells containing DNA, extracting the DNA from the sample, and purifying it. Fragments of different lengths are then produced and sorted by placing them on a gel and subjecting the gel to an electric current in a process called electrophoresis: the shorter the fragment, the more quickly it moves toward the positive pole (anode). After electrophoresis, the sorted, double-stranded DNA fragments are subjected to a blotting technique in which they are split into single strands and transferred to a nylon sheet.
The following are some of the most common sources of error in directional drilling. The survey instrument's performance depends on the package design elements, calibration performance, and quality control during operation. System performance also depends functionally on the borehole inclination, azimuth, geomagnetic-field vector, and geographical position. Because gyrocompassing tools sense Earth's spin rate, their azimuth uncertainty is inversely proportional to the cosine of the latitude of the wellbore location, so performance degrades at high latitudes. For magnetic tools, high latitudes likewise result in weaker horizontal components of Earth's field. A sketch of the latitude scaling follows.
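The Python sketch below illustrates the 1/cos(latitude) scaling described above: the horizontal component of Earth's spin rate, the signal a gyrocompassing survey senses, shrinks toward the poles, so a reference azimuth uncertainty grows accordingly. The reference uncertainty and latitude are hypothetical values chosen only for illustration.

```python
import math

EARTH_ROTATION_DEG_PER_HR = 15.041  # sidereal spin rate sensed by gyrocompassing tools

def horizontal_earth_rate(latitude_deg):
    """Horizontal component of Earth's spin rate at a given latitude."""
    return EARTH_ROTATION_DEG_PER_HR * math.cos(math.radians(latitude_deg))

def gyro_azimuth_uncertainty(ref_uncertainty_deg, ref_latitude_deg, latitude_deg):
    """Scale a reference azimuth uncertainty by 1/cos(latitude): as the horizontal
    spin-rate signal weakens toward the poles, the azimuth error grows."""
    return (ref_uncertainty_deg * math.cos(math.radians(ref_latitude_deg))
            / math.cos(math.radians(latitude_deg)))

# Hypothetical reference performance: 0.5 deg azimuth uncertainty at 30 deg latitude.
for lat in (0.0, 30.0, 60.0, 75.0):
    print(f"lat {lat:4.0f} deg: signal {horizontal_earth_rate(lat):5.2f} deg/hr, "
          f"uncertainty ~{gyro_azimuth_uncertainty(0.5, 30.0, lat):.2f} deg")
```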
As the quest for new petroleum supplies has intensified in the past few years, operators have been forced to drill deeper to find new reserves. Much of the higher cost of drilling deeper, especially onshore, is typically associated with decreased rate of penetration (ROP) caused by both harder rock and the higher mud weights required to counter the overpressured reservoirs often associated with deeper drilling. The following discussion centers on technologies intended to enhance deep-drilling capability. Industrial hammers for hard-rock drilling have been around for some time, but most have been air operated and used in the mining industry. Historically, hammers have been thought to have limited capability in oil and gas drilling operations, with their use limited to air drilling.
Cyber security affects every aspect of modern life, from personal emails to identity information, corporate payrolls, and even national intelligence secrets. Within the oil and gas industry, cybercrime such as data breaches can pose major threats to organizations. Supervisory Control and Data Acquisition (SCADA) systems and Drilling Control Systems (DCS) are two control systems that are critical to the infrastructure of the oil and gas industry. Originally designed to sit in physically secured, isolated areas, these systems must now be integrated with IT networks.[1] This integration, along with the increased use of commercial off-the-shelf (COTS) components, such as USB drives and Windows, has decreased the isolation of SCADA and DCS, thus increasing their vulnerability to attacks.
Drilling with casing is not a new concept; it has been used in the mining and water-well industries[1] for many years. However, modifying the tools and materials for oilfield use and extending drilling depth beyond a few thousand feet is new. This approach, called Casing Drilling™, was developed[2] and field tested[3] by Tesco Corp. and its partner, Conoco, culminating in a successful demonstration to 9,500 ft in South Texas early in 2002. The demonstration was the result of 5 years of development that included the development of tools for directional drilling.[4] It produced an actual overall drilling-time reduction of 17.5% and a potential reduction of as much as 33%.[5]
Reservoir geophysics should aggressively take advantage of data from boreholes that are very close to the target itself, not just to correlate seismic data to the well but also to collect novel geophysical data from below the noisy surface or weathered zone. New techniques for acquiring seismic data from wellbores are available and should become routine tools in the arsenal of the reservoir geophysicist. Sonic logging has become routine, and the collection of compressional and shear velocities in fast and slow formations is more or less straightforward, particularly with the use of dipole sonic tools and waveform processing. The application of modified sonic-logging tools for imaging near the wellbore is not routine but has been demonstrated in several cases; research and development continues in this area. Modern sonic-logging tools can provide a good measure of compressional and shear velocities, values that are required for calibrating seismic data at wells and for investigating lithology and fluid content from seismic data.
Biological monitoring is the ongoing examination of biological specimens taken from a specific environment[1] to identify any human-caused issues. In the oil and gas industry, biological monitoring programs provide important data for decision making and for ensuring the protection of resources and ecosystems.[2] To evaluate ecosystems, flora, fauna, arthropods, birds, small mammals, and other species are examined within specific sampling zones to determine the effect on their respective habitats and their density within those habitats. Tissue samples are analyzed to determine the effect of chemicals on specific species. Air, water, and soil samples are tested for signs of environmental toxicity. Based on the data collected, a plan is then created to prevent potential or further damage to the zones affected by the drilling area.