**Current Filters**

**Source**

**Conference**

**Publisher**

**Theme**

**Author**

- Aaron, Peter (1)
- Abushaala, Lamin (1)
- Agudelo, William (1)
- Alerini, Mathias (1)
- Ali, Hafedh Ben-Hadj (1)
- Amazonas, D. (1)
- Anno, Phil (1)
- Audebert, Francois (1)
- Audebert, François (1)
- Baldock, Simon (1)
- Baofu, Chen (1)
- Becerra, Carlos (1)
- BenGheit, Ali Omar (1)
- Benkovis, Laszlo (1)
- Bevc, D. (1)
- Biondi, B. (1)
- Biondi, Biondo (1)
- Bisley, Richard (1)
- Biswal, Ajoy (1)
- Boonyasiriwat, Chaiwoot (1)
- Bradford, John (1)
- Burgess, Brian (1)
- Cai, Wenying (2)
- Camez, J.C. (1)
- Cao, Weiping (1)
- Castagna, John (1)
- Chang, Itze (1)
- Chunhua, Cheng (1)
- Comeaux, Lynn (1)
- Costa, J.C. (1)
- Cuenca, Johnni Cestari (1)
- Dahl, Sverre Brandsberg (1)
- Dehghan, Kaveh (2)
- Dingyu, Xiong (1)
- Donati, Maria S. (1)
- Dong, Liangguo (1)
- Duquet, B. (1)
- Duquet, Bertrand (1)
- Fliedner, M.M. (1)
- Fliedner, Moritz (1)
- Fromyr, Eivind (1)
- Fruehn, Juergen (1)
- Gancarski, S. (1)
- Giraud, Luc (1)
- Gomes, E.N.S. (1)
- Guan, Huimin (1)
- Guevara, Saul (1)
- Guillaume, Patrice (1)
- Guitton, Antoine (2)
- Guo, Manhong (1)
- Gurch, Michael (1)
- Haidar, Azzam (1)
- Hall, Matt (1)
- Hinz, Emily (1)
- Hole, John A. (1)
- Hu, Chaoshun (1)
- Huang, Tony (1)
- Imhof, Matthias G. (1)
- Ionescu, Gentiana (1)
- Ivanov, Julian (1)
- Jiang, Fan (1)
- Jiao, Junru (1)
- Jones, Ian F. (1)
- Journel, Andre G. (1)
- Jousselin, Pierre (1)
- Kapoor, Jerry (1)
- la Cruz, Miguel (1)
- Lambaré, Gilles (1)
- Le Moigne, José (1)
- Le Stunff, Y. (1)
- Le Stunff, Yves (1)
- Leblanc, Olivier (1)
- Lemaistre, L. (1)
- Lescoffit, Séverine Pannetier (1)
- Li, Peiming (1)
- Li, Zhiming (3)
- Liao, Qingbo (2)
- Lin, Sonny (1)
- Liu, Hui (1)
- Liu, Yongshe (1)
- Liu, Yuzhu (1)
- Loveday, David C. (1)
- Lundy, Ashley (1)
- MacNeill, Malcolm (1)
- Macy, Brian (1)
- Markiewicz, Richard D. (1)
- Martínez, Jaime Ramos (1)
- Mason, Chuck (1)
- Mathur, Mohit (1)
- McIntosh, Kirk (1)
- Melo, A. (1)
- Menyoli, Elive M. (1)
- Miller, Becky (1)
- Miller, Richard D. (1)
- Mitouard, Pierre (1)
- Montel, Jean Philippe (1)
- Mukerji, Tapan (1)
- Munoz, Pedro (1)
- Nichols, D. (1)
- Operto, Stéphane (2)
- Ortigosa, Francisco (4)
- Osypov, K. (1)
- Prescott, Tony (1)
- Ravaut, Céline (1)
- Richardson, Mike (1)
- Rodriguez, Gary (2)
- Routh, Partha (1)
- Roy, Baishali (1)
- Sangvai, Pranaya (1)
- Schleicher, J. (1)
- Scholz, Scott (1)
- Schuster, Gerard T. (1)
- Shaw, Simon (1)
- Siliqi, Risto (1)
- Silva, F.J.C. (1)
- Sourbier, Florent (2)
- Specht, Jim (1)
- Stekl, Ivan (1)
- Stoffa, Paul (1)
- Tang, Cristina Reta (1)
- Thomassen, Espen (1)
- Umpleby, Adrian (1)
- Valasek, Paul (1)
- Valler, Victoria (1)
- van Borselen, Roald (1)
- Vidal, Nicolas (1)
- Virieux, Jean (2)
- Wang, Bin (1)
- Wang, Juan (1)
- Warner, Mike (1)
- Washah, Abulrazag M. (1)
- Whiteside, Wilfred (1)
- Whitmore, Dan (1)
- Wijnen, Peter (1)
- Woodward, M. (1)
- Woodward, Marta (1)
- Xia, Jianghai (1)
- Xiaoqiao, Ren (1)
- Xie, Xiao Bi (1)
- Xu, Sheng (1)
- Xu, Wenlong (1)
- Yang, Hui (1)
- Yarman, C.E. (1)
- Yoon, Kwangjin (1)
- Zdraveva, O. (1)
- Zhang, Rui (1)
- Zhang, Xiaoming (1)
- Zhou, Chaoguang (1)
- Zhou, Hua Wei (1)
- Zhu, Xianhuai (1)
- Zimine, Serge (1)

**Concept Tag**

- acquisition (4)
- algorithm (4)
- amplitude (2)
- analysis (4)
- anisotropy (3)
- application (4)
- approach (5)
- Artificial Intelligence (1)
- Australia (1)
- azimuth (4)
- backprojection (2)
- backprojection point (2)
- Billette (1)
- Bootstrap (1)
- building (5)
- Case History (1)
- case study (2)
- channel (2)
- Checkerboard (1)
- Checkerboard test (1)
- CIG (1)
- condition (2)
- constraint (2)
- correction (2)
- data (9)
- depth (8)
- direction (2)
- DLT model (1)
- domain (3)
- equation (3)
- estimation (2)
- Figure (18)
- formation evaluation (17)
- frequency (5)
- function (2)
- gather (6)
- geologic modeling (2)
- geophysics (5)
- image (11)
- Imaging (9)
- information (2)
- inverse problem (2)
- inversion (14)
- iteration (2)
- knowledge management (4)
- layer (4)
- meeting (6)
- method (8)
- migration (11)
- model (26)
- Model Building (2)
- noise (2)
- parameter (3)
- point (2)
- PreSDM (3)
- prestack (2)
- problem (3)
- PSDM (4)
- ray (6)
- ray path (2)
- receiver (2)
- reference (7)
- reflection (4)
- reflector (3)
- regularization (4)
- Reservoir Characterization (17)
- reservoir description and dynamics (17)
- reservoir simulation (14)
- resolution (3)
- salt (8)
- seg las vegas (14)
- seismic modeling (2)
- seismic processing and interpretation (16)
- semblance (2)
- sensitivity (2)
- sensitivity kernel (2)
- shot (2)
- slowness (2)
- solution (3)
- source (5)
- subsalt (2)
- survey (5)
- time (4)
- tomography (39)
- tool (2)
- traveltime (10)
- Traveltime Tomography (2)
- uncertainty (2)
- Upstream Oil & Gas (17)
- variation (2)
- velocity (19)
- velocity model (14)
- velocity model building (2)
- water (2)
- Wave (3)
- wavefield (2)
- wavepath (2)
- Wavepath tomography (2)
- well (2)
- workshop organizer (2)

Fruehn, Juergen (ION GX Technology) | Jones, Ian F. (ION GX Technology) | Valler, Victoria (ION GX Technology) | Sangvai, Pranaya (Reliance Industries Ltd.) | Mathur, Mohit (Reliance Industries Ltd.) | Biswal, Ajoy (Reliance Industries Ltd.)

Imaging in deep water environments poses a specific set of challenges, both in data pre-conditioning and in velocity model building. These challenges include scattered complex 3D multiples, aliased noise, and low-velocity shallow anomalies associated with channel fills and gas hydrates. In this paper, we describe an approach to tackling such problems for data from deep water off the northeast coast of India, concentrating on iterative velocity model building and, more specifically, the resolution of near-surface and other velocity anomalies. In the region under investigation, the velocity field is complicated by narrow buried canyons (500 m wide) filled with low-velocity sediments, which give rise to severe pull-down effects; possible free-gas accumulation below an extensive gas hydrate cap, producing dimming of the image below (perhaps as a result of absorption); and thin channel bodies with low-velocity fill. Hybrid gridded tomography using a conjugate gradient solver (with 20 m vertical cell size) is used to resolve small-scale velocity anomalies (with thicknesses of about 50 m). Manual picking of narrow channel features is used to define bodies too small for the tomography to resolve. Pre-stack depth migration using a velocity model built with a combination of these techniques was able to resolve pull-down and other image distortion effects in the final image. The resulting velocity field shows high-resolution detail useful in identifying anomalous geobodies of potential exploration interest.

accumulation, channel, depth, Figure, formation evaluation, gas hydrate, gather, image, Imaging, migration, model, PreSDM, reservoir simulation, resolution, sea bed, sub-sea system, subsea system, tomography, velocity, velocity anomaly, velocity profile, water

SPE Disciplines: Reservoir Description and Dynamics > Reservoir Characterization > Seismic processing and interpretation (1.00)

It is often claimed that the spatial resolution limit for traveltime tomography is the first Fresnel zone associated with the dominant frequency in the data. To test this assertion, synthetic seismic data were generated for traveltime picking and inversion for a single, small velocity anomaly embedded in a homogeneous background velocity. All picking methods produced accurate ray-theoretical (infinite-frequency) picks from noise-free seismic data for objects much smaller than the dominant-frequency Fresnel zone. All methods detected the presence of objects smaller than a wavelength. The inversion of these traveltime picks always recovered the position and shape of the object. Random noise was then added to the synthetic seismic data. Pick times with different noise realizations were statistically centered on the noise-free pick, not on the time that would be recorded in the absence of the object. Trace stacking prior to picking, or the averaging of many picks, improves the signal-to-noise ratio and can extract signal that is not resolved on an individual pick. An averaging of traveltime picks also occurs during tomographic inversion. This inherent signal-to-noise improvement allows tomography to image small objects that are not resolved in individual trace picks. The resolution of tomography is limited not by the Fresnel zone associated with the dominant frequency, but by the accuracy of the traveltime picks and the density of ray coverage.
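
The stacking argument in the abstract above (averaging many noisy picks recovers a delay smaller than the single-trace noise) can be checked with a toy numerical experiment. The 2 ms delay, 10 ms pick noise, and 400 rays are invented values for illustration, not numbers from the paper:

```python
import random
import statistics

random.seed(0)

TRUE_DELAY = 2.0    # ms, traveltime perturbation from a small object (hypothetical)
NOISE_STD = 10.0    # ms, picking noise on a single trace (hypothetical)
N_PICKS = 400       # number of rays crossing the anomaly

# Single noisy picks: the 2 ms signal is buried in 10 ms noise.
picks = [TRUE_DELAY + random.gauss(0.0, NOISE_STD) for _ in range(N_PICKS)]

single_pick_std = statistics.stdev(picks)
mean_pick = statistics.mean(picks)

# Averaging N picks shrinks the noise by roughly sqrt(N):
# 10 ms / sqrt(400) = 0.5 ms, so the 2 ms delay becomes detectable.
print(f"single-pick noise: {single_pick_std:.1f} ms")
print(f"averaged pick:     {mean_pick:.2f} ms (true delay {TRUE_DELAY} ms)")
```

The averaged pick sits within a fraction of a millisecond of the true delay even though no individual pick resolves it, which is the same averaging that happens implicitly inside a tomographic inversion.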

data, Figure, formation evaluation, frequency, Fresnel zone, geophysics, geophysics Journal, inversion, method, noise, object, pick, resolution, tomography, traveltime, traveltime pick, Traveltime Tomography, velocity, wavelet

SPE Disciplines: Reservoir Description and Dynamics > Reservoir Characterization > Seismic processing and interpretation (1.00)

Tomographic velocity model building has become an industry standard for depth migration. Anisotropy of the Earth challenges tomography because the inverse problem becomes severely ill-posed. Singular value decomposition (SVD) of tomographic operators or, similarly, eigendecomposition of the corresponding normal equations, is well known as a useful framework for analyzing the most significant dependencies between model and data. However, application of this approach to velocity model building has been limited, primarily because of the perception that it is computationally prohibitively expensive, especially in the anisotropic case.

In this paper, we extend our prior work (Osypov et al., 2008) to VTI tomography, modify the process of regularization optimization, and propose an updated way for uncertainty and resolution quantification using the apparatus of eigendecomposition. We demonstrate the simultaneous tomographic estimation of VTI parameters on a real dataset. Our approach provides extra capabilities for regularization optimization and uncertainty analysis in anisotropic model parameter space which can be further translated into the structural uncertainty within the image.

Velocity model building is one of the most challenging problems in modern depth imaging. 3D tomographic analysis has become the key technology for tackling this problem (Woodward et al., 1998). However, the tomographic inverse problem is ill-posed, which leads to large uncertainties and ambiguities in the reconstruction. The need to account for anisotropy in seismic velocities complicates this issue even further (Stunff et al., 1999; Grubb et al., 2001; Liu et al., 2004; Zhou et al., 2004). To circumvent this, various forms of regularization are used (e.g., Tikhonov and Arsenin, 1977; Scales, 1987; Osypov and Scales, 1996; Yao et al., 1999). However, regularizing tomography remains a subjective art, if not black magic. An alternative way of solving the inverse problem, a Bayesian framework for model estimation, requires knowledge of prior information and data uncertainty (Tarantola, 1987; Scales and Tenorio, 2001). For 1-D problems, Malinverno and Parker (2006) proposed an empirical Bayes approach, in which model priors and data uncertainties are treated as hyperparameters to be optimized. In general, for 3D seismic tomography, prior information can come from other geophysical and borehole data, or, more often, from a geoscientist's input (Bear et al., 2005). In the latter case it becomes difficult to rigorously quantify this input in terms of probability functions.

Singular value decomposition (SVD) provides an elegant framework for analyzing the most significant dependencies between model and data. Truncated SVD can be used as a form of regularization (Jackson, 1972; Lines and Treitel, 1984; Scales et al., 1990), and the resolution-covariance trade-off can be analyzed (Kalscheuer and Pedersen, 2007). For large problems, where a direct SVD solution is not computationally feasible, various iterative schemes for partial SVD have been proposed, pioneered by Lanczos (1950) and Arnoldi (1951) and developed further by Parlett (1980), Cullum and Willoughby (1986), and Sorensen (1992), to name a few. Instead of an SVD of the tomographic operator, one can perform an eigendecomposition of the corresponding symmetric positive-definite matrix from the normal equations.
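
To make the truncated-SVD/eigendecomposition idea concrete, here is a deliberately tiny sketch: a nearly singular 2×2 stand-in for the normal-equations matrix, eigendecomposed in closed form, with the small eigenvalue discarded as regularization. All numbers are invented for illustration; real tomographic systems are vastly larger and require the iterative partial decompositions cited above:

```python
import math

# A stand-in for the symmetric positive-definite normal-equations matrix
# G^T G of a tomographic system (values are made up, nearly singular).
a, b, c = 2.0, 1.999, 2.0          # matrix [[a, b], [b, c]]
d = [1.0, 1.0]                     # right-hand side G^T d

# Closed-form eigendecomposition of a symmetric 2x2 matrix.
half_trace = (a + c) / 2.0
radius = math.hypot((a - c) / 2.0, b)
eigvals = [half_trace + radius, half_trace - radius]   # 3.999 and 0.001

def eigvec(lam):
    # (b, lam - a) is an (unnormalized) eigenvector when b != 0
    vx, vy = b, lam - a
    n = math.hypot(vx, vy)
    return (vx / n, vy / n)

# Truncated solve: keep only eigenvalues above a threshold (regularization).
THRESHOLD = 0.01
m = [0.0, 0.0]
for lam in eigvals:
    if lam <= THRESHOLD:
        continue                    # discard the ill-determined direction
    v = eigvec(lam)
    coef = (v[0] * d[0] + v[1] * d[1]) / lam
    m[0] += coef * v[0]
    m[1] += coef * v[1]

print(m)   # both components ~0.25: only the well-determined mode survives
```

Keeping only the large eigenvalue suppresses the model direction the data barely constrain, which is exactly what the eigenvalue spectrum of the tomographic operator is used for in the uncertainty analysis described above.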

Technology:

- IT > AI > Representation & Reasoning > Uncertainty > Bayesian Inference (0.34)
- IT > AI > Machine Learning > Bayesian Networks (0.34)

In this paper we describe the definition and applications of structurally conformable transversely isotropic (STI) velocity models in seismic imaging. After describing some geological aspects of STI media, we present the effects of this kind of anisotropy on processing and imaging. We illustrate our discussion with synthetic and real examples showing large structural effects in these media and the errors introduced if we represent them with VTI models.

We used seismic modeling and imaging as effective tools to verify the structural interpretation of a seismic section from a geologically complex area at an onshore field in Venezuela. By analyzing PSTM image problems, combined with nearby well information, we derived a new structure/velocity model and reproduced the image problems by pre-stack time migration of synthetic seismic data acquired from this new structure model. The comparison of PSTM images of the synthetic and real data verified the new interpretation concept and confirmed that the target structure was distorted by the overriding thrust faults in the PSTM images, so PSDM had to be adopted to produce an accurate structural image. To select an adequate PSDM velocity model building workflow for this specific imaging objective, the same synthetic dataset was used to evaluate different PSDM model building approaches. Both grid- and model-based tomography workflows were evaluated. From the comparison between the PSDM images of the two approaches, we concluded that grid tomography without structure-surface constraints is inadequate and that a model-based velocity update method is suitable for achieving the imaging objective.

data, fault, Figure, formation evaluation, geologic modeling, image, Imaging, interpretation, model-based reasoning, modeling application, PSDM, PSTM, PSTM image, reservoir simulation, seismic modeling, structure, structure interpretation, target, tomography, velocity, velocity model, velocity update

SPE Disciplines: Reservoir Description and Dynamics > Reservoir Characterization > Seismic processing and interpretation (1.00)

This study aims at building high-resolution velocity/slowness models by integrating coarse-scale traveltime tomography data with fine-scale well data while accounting for prior information such as spatial correlation and statistics (mean, variance, etc.). Three approaches are proposed and compared: kriging, direct sequential simulation, and error simulation. Kriging estimation and stochastic simulation are performed in the original data space, which preserves the linearity of the tomographic average data. The three algorithms are implemented as plug-ins for the Stanford Geostatistical Modeling Software (SGeMS), a public-domain geostatistical modeling platform (Remy et al., 2008). The three plug-ins are block kriging estimation (
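
As a rough illustration of the kriging building block behind the plug-ins described above, the sketch below performs simple kriging (known mean) of slowness at one location from two well points using an exponential covariance. The covariance model and every number are assumptions for illustration; this is not the paper's SGeMS block-kriging implementation:

```python
import math

# Toy simple-kriging estimate of slowness at one location from two wells.
SILL, RANGE = 1.0, 100.0           # hypothetical covariance parameters
def cov(h):
    # Exponential covariance model, effective range RANGE
    return SILL * math.exp(-3.0 * h / RANGE)

wells = [(0.0, 0.40), (80.0, 0.55)]   # (position in m, slowness in s/km)
MEAN = 0.50                           # known prior mean (simple kriging)
x0 = 30.0                             # estimation location

# 2x2 kriging system C w = c0, solved in closed form.
c11 = cov(0.0); c12 = cov(abs(wells[0][0] - wells[1][0])); c22 = cov(0.0)
r1 = cov(abs(x0 - wells[0][0])); r2 = cov(abs(x0 - wells[1][0]))
det = c11 * c22 - c12 * c12
w1 = (c22 * r1 - c12 * r2) / det
w2 = (c11 * r2 - c12 * r1) / det

# Kriging estimate and variance (the variance is what the paper's
# uncertainty-aware workflows exploit).
estimate = MEAN + w1 * (wells[0][1] - MEAN) + w2 * (wells[1][1] - MEAN)
variance = SILL - (w1 * r1 + w2 * r2)
print(estimate, variance)
```

The estimate leans toward the nearer well, and the kriging variance quantifies how poorly constrained the location is; block kriging generalizes the right-hand side covariances to averages over tomography cells.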

algorithm, block, data, estimation, Figure, formation evaluation, geologic modeling, histogram, program, ray, ray path, realization, reference, reference model, Reproduction, reservoir simulation, result, tomography, variogram, well, well data

SPE Disciplines:

- Reservoir Description and Dynamics > Reservoir Characterization > Geologic modeling (1.00)
- Reservoir Description and Dynamics > Reservoir Characterization > Seismic processing and interpretation (0.95)
- Reservoir Description and Dynamics > Formation Evaluation & Management > Cross-well tomography (0.83)

We propose a multiscale method for time-domain waveform tomography. The data are low-pass filtered using a Wiener filter, and the inversion is carried out using a multiscale method with a dynamic early-arrival muting window. The adjoint method is used in the inversion for an efficient computation of the gradient directions. A staggered-grid, explicit finite-difference method with 4

application, data, formation evaluation, frequency, geophysics, image, inversion, model, Multiscale, Multiscale Method, Multiscale Waveform Tomography, reservoir simulation, scaling method, Software Engineering, tomogram, tomography, traveltime, velocity, velocity model, waveform, waveform tomogram, waveform tomography

We present a 3D massively parallel frequency-domain full-waveform tomography algorithm based on a domain decomposition method and a hybrid direct-iterative solver for the wave-propagation forward problem. The hybrid solver is used to overcome the memory burden of a direct solver while remaining efficient for the multiple-shot simulations required by tomographic applications. We compare the performance of two 3D parallel waveform tomography algorithms, based on direct and hybrid solvers respectively, using a small case study. The results confirm the memory savings provided by the hybrid solver and its higher CPU cost, which results from the use of the iterative solver for multiple-shot simulations.

We present a massively parallel frequency-domain full waveform tomography (FWT) algorithm for imaging 3D acoustic media, based on a hybrid direct-iterative solver. FWT refers to an imaging method based on the complete solution of the two-way wave equation for the forward problem and on inverse-problem theory for the imaging problem (Tarantola, 1987). Frequency-domain FWT was originally developed for 2D crosshole acquisition surveys, which involve wide-aperture propagation (Pratt and Worthington, 1990). In this framework, only a few discrete frequencies are required to image the medium, thanks to the wavenumber redundancy provided by multifold wide-aperture geometries (Sirgue and Pratt, 2004).

Wave-propagation modeling is a critical issue in FWT since it is the most computationally expensive task in the processing flow. In the frequency domain, the forward problem reduces to the resolution of one large sparse system of linear equations per frequency, whose right-hand side is the source and whose solution is the monochromatic wavefield. In 2D, this linear system is classically solved with a direct solver, which efficiently computes multiple solutions once the impedance matrix has been LU-factorized. In 3D, the time and memory complexities of direct solvers may prevent application to computational domains involving several tens of millions of unknowns. A feasibility study of 3D frequency-domain full-wave modeling and inversion based on a direct solver was presented in Operto et al. (2007) and BenHadjAli et al. (2008), who showed that low frequencies (< 7 Hz) can be tackled with such an approach on realistic case studies using currently available PC clusters. Alternative forward-modeling approaches have been proposed to perform 3D frequency-domain FWT, based either on time-domain methods (Sirgue et al., 2007; Nihei and Li, 2007) or on iterative solvers (Warner et al., 2007; Riyanti et al., 2007). The advantages of these approaches are their lower memory requirements and their good scalability. However, their computational time increases linearly with the number of sources.

In this paper, we investigate a domain decomposition method (DDM) based on a hybrid direct-iterative solver to perform the forward problem in 3D FWT. The objective is to overcome the memory burden of direct solvers while remaining efficient for the multiple-shot simulations required by FWT. The first objective is reached by applying the direct solver to submatrices of small dimension, each corresponding to a subdomain of the computational domain. The second objective is achieved by applying the parallel iterative solver to a well-conditioned system of limited size, corresponding to the number of interface nodes between subdomains.
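
The economics the authors describe, factorize the impedance matrix once and reuse it for every shot, can be seen on a 1D toy Helmholtz problem. The sketch below uses a pure-Python tridiagonal (Thomas-algorithm) factorization for a homogeneous 1D medium; it is not the paper's 3D domain-decomposition solver, and the grid size, velocity, frequency, and damping are invented:

```python
import cmath

# 1D frequency-domain Helmholtz: (omega^2/c^2) u + u'' = rhs
N, H = 200, 10.0                 # grid points and spacing in m (toy values)
C = 2000.0                       # homogeneous velocity, m/s (hypothetical)
FREQ = 5.0                       # Hz, a typical low FWT frequency
omega = 2.0 * cmath.pi * FREQ * (1.0 + 0.02j)   # small damping avoids resonance

diag = omega**2 / C**2 - 2.0 / H**2   # main diagonal of the impedance matrix
off = 1.0 / H**2                      # off-diagonals (second-difference stencil)

# Factorize the tridiagonal impedance matrix ONCE (forward sweep of the
# Thomas algorithm); `cp` plays the role of the stored LU factors.
cp = [0.0] * N
cp[0] = off / diag
for i in range(1, N):
    cp[i] = off / (diag - off * cp[i - 1])

def solve(rhs):
    """Back-substitute one source term against the stored factorization."""
    dp = [0.0] * N
    dp[0] = rhs[0] / diag
    for i in range(1, N):
        dp[i] = (rhs[i] - off * dp[i - 1]) / (diag - off * cp[i - 1])
    u = [0.0] * N
    u[-1] = dp[-1]
    for i in range(N - 2, -1, -1):
        u[i] = dp[i] - cp[i] * u[i + 1]
    return u

# Reuse the factorization for many shots -- the cheap part of direct solvers.
wavefields = [solve([1.0 if i == src else 0.0 for i in range(N)])
              for src in (50, 100, 150)]
```

Each additional shot costs only a back-substitution; an iterative solver, by contrast, pays the full solve per source, which is the linear-in-sources cost trade-off the abstract discusses.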

algorithm, computational, domain, domain decomposition, Figure, formation evaluation, frequency, FWTH, inversion, knowledge management, memory, method, model, modeling, number, problem, processor, reservoir simulation, solver, source, tomography, velocity

Technology:

- IT > Scientific Computing (0.56)
- IT > Hardware > Memory (0.34)

Over the past decade, the majority of deep water blocks in the Gulf of Mexico have been covered multiple times by seismic data from narrow-azimuth, towed-streamer acquisition (NAZ). In complex subsalt areas, each NAZ dataset provides unique subsurface illumination benefits. Multiple-azimuth data are now frequently integrated to provide extended subsurface coverage and better imaging of complex subsalt structures. Multiple-azimuth seismic data, with shot and receiver locations covering a large portion of the two-dimensional surface, present a new challenge for deriving a single velocity model that satisfies both datasets.

Velocity variation with azimuth is observed in an orthogonal dual-azimuth streamer dataset in Deep Water Gulf of Mexico. This paper presents the benefit of tilted transversely isotropic (TTI) tomography to yield an anisotropy model that flattens gathers for all azimuths as well as improves focusing and spatial positioning of steeply-dipping salt flanks.

anisotropy, azimuth, data, Figure, formation evaluation, gather, image, Imaging, improve, meeting, migration, model, reference, salt flank, survey, tomography, TTI tomography, velocity, velocity model

The low-velocity propagation of seismic waves in the first meters of the subsurface, together with topographic irregularities, affects the correct positioning of deep reflectors on seismic images. Velocity models built using tomographic inversion algorithms have proven useful for correcting this effect, but the solution has associated uncertainties. Three uncertainty-analysis techniques are applied to a tomographic inversion of real near-surface data, giving a measurement of the associated error. These techniques are computationally expensive because they explore the model space intensively; even so, current high-performance computers make their practical use possible.

Seismic tomography techniques are commonly used to obtain velocity models of the weathering layers, which are then used to calculate statics corrections. Several tomographic techniques have been developed, with different methodologies for ray tracing and for solving the inverse problem. The main idea is to use the misfit between computed and observed first-arrival times to correct an initial velocity model. Because seismic data are generally acquired at only one edge of the medium, the problem does not have a unique solution: a family of solutions can explain the same dataset. To give meaning to a non-unique solution, uncertainty-analysis techniques are necessary. They can provide additional information about the solution, such as: Which zones are better determined? Which zones are better resolved? Is there a dispersion measurement (such as variance) associated with the solution model family? Is there a most probable solution model? In this case study we have applied four methodologies: jackknife, checkerboard test, Monte Carlo, and bootstrap.

The seismic data used were acquired in the Catatumbo basin (Norte de Santander, Colombia). It is a zone with rough topography, where the weathering layer has high thickness and velocity variability due to the particular conditions of these zones: the tropical weather, the solar energy, the high precipitation regime, the humidity, and the temperature. This variability has a strong effect on the seismic image, which has, in general, a low SNR.

To perform seismic tomography, the medium must be discretized. The 2-D velocity field is parameterized as a mesh of nodes hanging from the topography, where the node spacing can vary laterally and vertically. The forward problem of refracted arrivals is solved using a hybrid ray-tracing scheme based on the graph method with local ray-bending refinement. For the inverse problem we use the iterative matrix solver LSQR, based on conjugate gradients. To regularize the iterative linearized inversion, we employ smoothing constraints with predefined correlation lengths, as well as damping constraints. For further details of this method see Korenaga (2000).
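
A miniature version of this regularized inversion step, with a plain conjugate-gradient solve standing in for LSQR: solve (GᵀG + μLᵀL)Δm = GᵀΔt for a 3-cell slowness model crossed by 3 rays, where L is a first-difference smoothing operator. All matrices and numbers are invented for illustration:

```python
# Toy regularized update step:
#   (G^T G + mu * L^T L) dm = G^T dt
# solved with plain conjugate gradients (LSQR stand-in for illustration).
G = [[10.0, 10.0, 0.0],    # hypothetical ray path lengths (m) per cell
     [0.0, 10.0, 10.0],
     [10.0, 0.0, 10.0]]
dt = [0.12, 0.10, 0.14]    # traveltime residuals (s), invented
MU = 1.0                   # smoothing weight

L = [[-1.0, 1.0, 0.0], [0.0, -1.0, 1.0]]   # first-difference smoother

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

Gt, Lt = transpose(G), transpose(L)

def normal_op(x):
    # Apply (G^T G + mu * L^T L) to x
    y1 = matvec(Gt, matvec(G, x))
    y2 = matvec(Lt, matvec(L, x))
    return [a + MU * b for a, b in zip(y1, y2)]

b = matvec(Gt, dt)

# Conjugate gradients on the small symmetric positive-definite system.
x = [0.0, 0.0, 0.0]
r = b[:]
p = r[:]
rs = sum(ri * ri for ri in r)
for _ in range(50):
    Ap = normal_op(p)
    alpha = rs / sum(pi * ai for pi, ai in zip(p, Ap))
    x = [xi + alpha * pi for xi, pi in zip(x, p)]
    r = [ri - alpha * ai for ri, ai in zip(r, Ap)]
    rs_new = sum(ri * ri for ri in r)
    if rs_new < 1e-20:
        break
    p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
    rs = rs_new

print(x)   # smoothed slowness update per cell
```

Raising MU pulls neighboring cells toward each other, which is the role of the correlation-length smoothing constraints in the workflow above; the damping constraints would add a further μ_d·I term to the operator.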

**Jackknifing Method.** Jackknifing was proposed by Quenouille in 1956 and developed by Tukey in 1957. It is a procedure to estimate the uncertainty of the full set of model parameters (Less, 1989; White, 1990). In our application, the data set was divided into shot gathers.
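
The jackknife over shot gathers can be sketched in a few lines: re-estimate the parameter with each gather left out, then form the jackknife variance from the spread of the leave-one-out estimates. The per-gather velocity values below are invented, not data from this survey; for the mean statistic, the jackknife standard error coincides with the classical s/√n:

```python
import statistics

# Toy leave-one-out (jackknife) uncertainty estimate for one model
# parameter: weathering-layer velocity estimated from each shot gather.
gather_velocities = [620.0, 635.0, 610.0, 648.0, 625.0, 640.0, 615.0, 630.0]

n = len(gather_velocities)
full_estimate = statistics.mean(gather_velocities)

# Leave one shot gather out at a time and re-estimate.
loo = [statistics.mean(gather_velocities[:i] + gather_velocities[i + 1:])
       for i in range(n)]
loo_mean = statistics.mean(loo)

# Jackknife variance: (n-1)/n * sum_i (theta_i - theta_bar)^2
jack_var = (n - 1) / n * sum((t - loo_mean) ** 2 for t in loo)
jack_std = jack_var ** 0.5

print(f"velocity: {full_estimate:.1f} +/- {jack_std:.1f} m/s")
```

In the tomographic setting, "re-estimate" means rerunning the inversion without one gather, so the cost grows with the number of gathers; that is the computational expense the abstract mentions.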

Oilfield Places:

- North America > United States > West Virginia > Volcano Oil Field (0.98)
- Europe > Iceland > North Atlantic Ocean > Iceland Basin (0.91)
