**Source**

**Conference**

**Theme**

**Author**

- Chen, Hanming (1)
- Chen, Xiaohong (3)
- Chen, Yangkang (12)
- Gan, Shuwei (12)
- Jiao, Shebao (2)
- Jin, Zhaoyu (3)
- Li, Qingqing (1)
- Li, Summer Xia (1)
- Liu, Tingting (1)
- Wang, Erying (1)
- Wang, Shoudong (4)
- Wang, Yufeng (1)
- Xiang, Kui (1)
- Yang, Wencheng (3)
- Yuan, Jiang (1)
- Zhang, Chenglin (1)
- Zhang, Dong (1)
- Zhang, Guoyin (2)
- Zhang, Yizhuo (2)
- Zhou, Hui (1)

**Concept Tag**

- acquisition (1)
- algorithm (1)
- annual international meeting (1)
- annual meeting (2)
- application (1)
- Artificial Intelligence (5)
- attenuation (1)
- average spectrum (1)
- bandlimited signal-and-noise orthogonalization (1)
- bandpass (1)
- Boonsville Field (1)
- case study (1)
- Chen (2)
- constraint (1)
- convex (1)
- convolutional (1)
- convolutional operator (1)
- crossline 75 (1)
- dash line correspond (1)
- data example (1)
- data quality (4)
- de-aliased seismic data interpolation (1)
- deblended data (1)
- decomposition (1)
- denoised data (1)
- Denote (2)
- dip estimation (1)
- dipping-event retriever (1)
- distance separated simultaneous-source (1)
- EMD (2)
- empirical mode decomposition (2)
- equation (1)
- estimation (1)
- fast projection (1)
- field data (3)
- formulation (1)
- fractional time derivative (1)
- frequency (2)
- frequency slice (1)
- geophysics (3)
- ground roll (1)
- ground roll attenuation (1)
- HRSC framework (1)
- hybrid approach (1)
- hybrid rank-sparsity constraint (1)
- hz low-pass (1)
- interpolation (2)
- iteration (2)
- karstification (1)
- Komatitsch (1)
- local slope (2)
- machine learning (2)
- Martin (1)
- matching (1)
- matching filter (1)
- MSSA (2)
- multidimensional seismic (1)
- multiple constraint (1)
- NMF (1)
- noise (2)
- noise section corresponding (1)
- non-stationary matching (1)
- operator (3)
- oz-25 dataset (1)
- plane wave (1)
- pnmo-mf-fk filter (1)
- prediction (1)
- primary reflection (1)
- processing window (1)
- random noise (2)
- random noise attenuation (2)
- rank reduction method (1)
- reconstruction (2)
- reflection (1)
- regularization (1)
- Reservoir Characterization (12)
- resolution (1)
- Sacchi (1)
- section corresponding (1)
- SEG (1)
- seislet frame (1)
- seislet transform (1)
- seislet transform domain (1)
- seismic data (6)
- seismic data reconstruction (2)
- seismic event (1)
- seismic profile (1)
- seismic reflection (1)
- selective hybrid approach (1)
- separated simultaneous-source data (1)
- shot domain (1)
- shot gather (1)
- simulated field data (1)
- slope estimation (1)
- spectrum (2)
- SSWT (1)
- TF analysis (1)
- TF analysis approach (1)
- Upstream Oil & Gas (11)
- Wave Equation (1)

**File Type**

We first demonstrate our work on multi-dimensional seismic data reconstruction with multiple constraints, and then provide perspectives and insights on this framework from our experience, especially new combinations and further applications of multiple constraints for complicated seismic data. The core of seismic reconstruction is the choice of constraint. Two approaches are currently popular for designing such a constraint: sparsity-promoting transforms with a sparsity constraint, and rank reduction methods with a rank constraint. While sparsity-promoting transforms enjoy high efficiency, they lack adaptivity to various data patterns. Rank reduction methods, on the other hand, can be adaptively applied to different datasets but are computationally expensive. In this abstract, we propose multiple constraints for seismic data reconstruction based on a novel hybrid rank-sparsity constraint (HRSC) model that combines the benefits of both approaches. We also design the corresponding HRSC algorithmic framework to solve the proposed model effectively. Application of the HRSC framework to synthetic and field seismic data demonstrates superior performance compared with the well-known multichannel singular spectrum analysis (MSSA).
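As an illustrative sketch only (not the authors' implementation; the 2-D FFT as sparsifying transform, the rank, and the threshold level are all placeholder assumptions), a hybrid rank-sparsity iteration can alternate a rank projection, a sparsity projection, and data consistency with the observed traces:

```python
import numpy as np

def rank_reduce(M, k):
    """Project a matrix onto the set of rank-k matrices via truncated SVD."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k]

def soft_threshold(C, tau):
    """Complex-safe soft thresholding of transform coefficients."""
    mag = np.maximum(np.abs(C), 1e-12)
    return C * np.maximum(1.0 - tau / mag, 0.0)

def hrsc_reconstruct(obs, mask, rank=3, perc=0.02, niter=50):
    """Alternate a rank constraint, a Fourier-domain sparsity constraint,
    and re-insertion of the observed samples."""
    x = obs.copy()
    for _ in range(niter):
        x = rank_reduce(x, rank)                       # rank constraint
        X = np.fft.fft2(x)
        X = soft_threshold(X, perc * np.abs(X).max())  # sparsity constraint
        x = np.real(np.fft.ifft2(X))
        x = np.where(mask, obs, x)                     # data consistency
    return x

# Demo: recover a low-rank "section" with 30% of the samples removed.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 64, endpoint=False)
true = np.outer(np.sin(2 * np.pi * 3 * t), np.cos(2 * np.pi * 2 * t))  # rank 1
mask = rng.random(true.shape) < 0.7
obs = np.where(mask, true, 0.0)
rec = hrsc_reconstruct(obs, mask)
err_zero = np.linalg.norm(true - obs)
err_rec = np.linalg.norm(true - rec)
```

The two projections are complementary in exactly the sense the abstract describes: the FFT step is cheap, while the SVD step adapts to the data.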

Presentation Date: Wednesday, October 19, 2016

Start Time: 10:20:00 AM

Location: 166

Presentation Type: ORAL

Artificial Intelligence, constraint, data quality, HRSC framework, hybrid rank-sparsity constraint, interpolation, MSSA, multidimensional seismic, multiple constraint, rank reduction method, reconstruction, Reservoir Characterization, Sacchi, seismic data, seismic data reconstruction, soft percentile, sparsity-promoting transform, Upstream Oil & Gas

SPE Disciplines: Reservoir Description and Dynamics > Reservoir Characterization > Seismic processing and interpretation (1.00)


**Summary**

We propose to use a structure-oriented median filter to attenuate the blending noise along the structural direction. The principle of the proposed approach is to first flatten the seismic record in local spatial windows and then apply a traditional median filter (MF) along the flattened dimension. The key component of the proposed approach is the estimation of the local slope, which can be calculated by first scanning the NMO velocity and then converting the velocity to local slope. Both synthetic and field data examples show the successful performance of the proposed approach.
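A minimal numpy sketch of the idea, assuming integer flattening shifts are already known (e.g., from an NMO scan); the window size and the demo geometry are illustrative, not the authors' settings:

```python
import numpy as np

def structural_median_filter(data, shifts, width=5):
    """Flatten each trace by its integer shift, apply a lateral median
    filter sample-by-sample, then shift back (inverse flattening)."""
    nt, nx = data.shape
    flat = np.empty_like(data)
    for ix in range(nx):
        flat[:, ix] = np.roll(data[:, ix], -shifts[ix])
    out = np.empty_like(flat)
    half = width // 2
    for ix in range(nx):
        lo, hi = max(0, ix - half), min(nx, ix + half + 1)
        out[:, ix] = np.median(flat[:, lo:hi], axis=1)
    for ix in range(nx):
        out[:, ix] = np.roll(out[:, ix], shifts[ix])
    return out

# Demo: a dipping event (1 sample/trace moveout) plus one spiky
# "blending noise" burst on a single trace.
nt, nx = 100, 21
data = np.zeros((nt, nx))
shifts = np.arange(nx)               # moveout assumed known from the scan
for ix in range(nx):
    data[40 + shifts[ix], ix] = 1.0  # coherent event
data[10, 8] = 10.0                   # incoherent interference
filtered = structural_median_filter(data, shifts)
```

After flattening, the event is laterally constant and survives the median, while the isolated burst is voted out.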

**Introduction**

The principal purpose of simultaneous-source acquisition is to accelerate the acquisition of a denser seismic dataset, which reduces acquisition cost and increases data quality. These benefits are compromised by the intense interference between different shots (Berkhout, 2008). One way of solving the interference problem is a first-separate, then-process strategy (Chen et al., 2014a), also called deblending (Akerberg et al., 2008; Abma et al., 2010; Huo et al., 2012; Mahdad et al., 2011; Blacquiere and Mahdad, 2012; Beasley et al., 2012; Doulgeris et al., 2012; Mahdad et al., 2012; Bagaini et al., 2012; Li et al., 2013; Chen and Ma, 2014; Chen et al., 2014b; Berkhout and Blacquiere, 2014; Chen, 2014). Another way is direct imaging and inversion of the blended data, attenuating the interference during the inversion process (Verschuur and Berkhout, 2011; Dai and Schuster, 2011; Xue et al., 2014; Chen et al., 2015). Currently, deblending is still the dominant way of dealing with simultaneous-source data.


Chen, Yangkang (The University of Texas at Austin) | Jin, Zhaoyu (The University of Edinburgh) | Gan, Shuwei (China University of Petroleum, Beijing) | Yang, Wencheng (China University of Petroleum, Beijing) | Xiang, Kui (China University of Petroleum, Beijing)

**Summary**

Simultaneous shooting achieves tremendously faster acquisition but poses a very challenging problem for subsequent processing because of the interference from neighboring crews. Separation of the different sources, also called deblending, is exceptionally important for the overall success of the new acquisition technology. In this paper, we propose a novel deblending approach following the shaping-regularization framework. The shaping operator is chosen as a very effective filter combining pseudo normal moveout, median filtering, and frequency-wavenumber filtering (PNMO-MF-FK). We combine the median-filter-based and FK-filter-based deblending approaches within the proposed framework. Instead of simply using the MF to remove blending noise, we first apply a pseudo NMO (PNMO) to prepare a relatively flat profile, which makes the MF more effective. The proposed deblending approach with the PNMO-MF-FK filter obtains excellent performance within a small number of iterations. Compared with the alternative MF-FK filter or the FK filter alone, the PNMO-MF-FK filter obtains obviously better results. We use a simulated field data example to demonstrate the performance.
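The FK component can be illustrated with a simple f-k fan filter that rejects low apparent velocities. This is a generic sketch with illustrative parameters, not the authors' PNMO-MF-FK operator:

```python
import numpy as np

def fk_velocity_filter(data, dt, dx, vmin):
    """Reject f-k energy whose apparent velocity |f/k| is below vmin
    (steeply dipping arrivals); pass everything else."""
    nt, nx = data.shape
    D = np.fft.fft2(data)
    f = np.fft.fftfreq(nt, dt)
    k = np.fft.fftfreq(nx, dx)
    F, K = np.meshgrid(f, k, indexing="ij")
    keep = np.abs(F) >= vmin * np.abs(K)   # |v_app| = |f/k| >= vmin
    return np.real(np.fft.ifft2(D * keep))

# Demo: a flat event (infinite apparent velocity) passes untouched, while
# a slow dipping event (2500 m/s, below the illustrative vmin) is
# strongly attenuated.
nt, nx, dt, dx = 128, 24, 0.004, 10.0
flat = np.zeros((nt, nx)); flat[50, :] = 1.0
dip = np.zeros((nt, nx))
for ix in range(nx):
    dip[20 + ix, ix] = 1.0       # 1 sample/trace => v = dx/dt = 2500 m/s
flat_out = fk_velocity_filter(flat, dt, dx, vmin=5000.0)
dip_out = fk_velocity_filter(dip, dt, dx, vmin=5000.0)
```

In the full method, the PNMO flattening is applied first precisely so that the signal concentrates near the vertical-velocity axis that this mask preserves.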

**INTRODUCTION**

Simultaneous-source shooting can help acquire better seismic data at tremendously lower acquisition cost, and thus has attracted much attention from industry in the past several years (Beasley et al., 1998; Berkhout, 2008; Moore et al., 2008; Abma et al., 2010; Beasley et al., 2012; Chen, 2014). In blended acquisition, more than one source is shot simultaneously, regardless of the mutual interference. The term *source* denotes a shot array, which can contain all the shots in a conventional acquisition system. When more than one source is involved in acquisition, either a denser or a wider shot coverage can be obtained for a given constant acquisition period (Chen et al., 2014b). The *wider* coverage corresponds to a higher acquisition efficiency, while the *denser* coverage corresponds to a better-sampled seismic dataset. The compromise for enjoying these benefits is the challenge of processing the blended data caused by intense blending interference. Separation of the blended data into individual sources, also known as deblending, and direct imaging of the blended data without the need for deblending are the two main ways of dealing with blended data (Chen et al., 2014a). Deblending can provide data similar to conventional acquisition and thus does not require changes to subsequent processing and imaging algorithms, but it needs specific, computationally expensive techniques for the pre-processing (Abma and Yan, 2009; Abma, 2014). Direct imaging (Verschuur and Berkhout, 2011; Xue et al., 2014; Chen et al., 2015) does not require any pre-processing of the observed data and thus enjoys the benefit of efficiency, but calls for a tremendously different processing workflow (Xue et al., 2014).

SPE Disciplines: Reservoir Description and Dynamics > Reservoir Characterization > Seismic processing and interpretation (1.00)

Gan, Shuwei (China University of Petroleum, Beijing) | Wang, Shoudong (China University of Petroleum, Beijing) | Chen, Yangkang (The University of Texas at Austin) | Chen, Xiaohong (China University of Petroleum, Beijing)

**Summary**

The distance-separated simultaneous sourcing technique minimizes the interference between different sources. In a distance-separated simultaneous-source acquisition system with two sources, we propose a novel iterative seislet-frame thresholding approach to separate the blended data. Because the separation is implemented in common-shot gathers, there is no need for the random scheduling used in conventional simultaneous-source acquisition, where random scheduling is applied to ensure the incoherence of the blending noise in common-midpoint, common-receiver, or common-offset gathers. Distance-separated simultaneous sourcing thus becomes more flexible. The separation is based on the assumption that the local dips of the data from different sources differ. We can use a plane-wave destruction (PWD) operator to estimate the conflicting dips simultaneously and then use seislet frames with the two corresponding local dips to sparsify each signal component. The interference becomes unpredictable noise in the dip-governed seislet transform domain and can thus be removed by soft thresholding. A simulated field data example shows the excellent performance of the proposed approach.
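The thresholding step can be illustrated in isolation. Here a Fourier transform stands in for the seislet frame (an assumption made only to keep the sketch self-contained): the coherent component is sparse in the transform domain, while the incoherent interference spreads thinly and falls below the threshold.

```python
import numpy as np

def soft_threshold(C, tau):
    """Shrink coefficient magnitudes toward zero by tau (complex-safe)."""
    mag = np.maximum(np.abs(C), 1e-12)
    return C * np.maximum(1.0 - tau / mag, 0.0)

# Demo: a coherent signal is concentrated in a few coefficients; the
# incoherent "blending noise" is spread across all of them.
rng = np.random.default_rng(0)
n = 512
t = np.arange(n) / n
signal = np.sin(2 * np.pi * 30 * t)
noisy = signal + 0.5 * rng.standard_normal(n)
C = np.fft.rfft(noisy)
den = np.fft.irfft(soft_threshold(C, 0.2 * np.abs(C).max()), n)
err_noisy = np.linalg.norm(noisy - signal)
err_den = np.linalg.norm(den - signal)
```

In the actual method, the dip-governed seislet frame plays this role for curved events, which a plain Fourier basis cannot sparsify.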

**Introduction**

The principal purpose of simultaneous-source acquisition is to accelerate the acquisition of a denser seismic dataset, which reduces acquisition cost and increases data quality. These benefits are compromised by the intense interference between different shots (Berkhout, 2008). One way of solving the interference problem is a first-separate, then-process strategy (Chen et al., 2014a), also called deblending (Moore et al., 2008; Akerberg et al., 2008; Moore, 2010; Abma et al., 2010; Huo et al., 2012; Mahdad et al., 2011; Blacquiere and Mahdad, 2012; Beasley et al., 2012; Doulgeris et al., 2012; Mahdad et al., 2012; Bagaini et al., 2012; Li et al., 2013; Chen and Ma, 2014; Chen et al., 2014b; Berkhout and Blacquiere, 2014; Chen, 2014). Another way is direct imaging and inversion of the blended data, attenuating the interference during the inversion process (Verschuur and Berkhout, 2011; Dai and Schuster, 2011; Xue et al., 2014; Chen et al., 2015). Currently, deblending is still the dominant way of dealing with simultaneous-source data.

SPE Disciplines: Reservoir Description and Dynamics > Reservoir Characterization > Seismic processing and interpretation (1.00)

Chen, Yangkang (The University of Texas at Austin) | Jiao, Shebao (China University of Petroleum, Beijing) | Gan, Shuwei (China University of Petroleum, Beijing) | Yang, Wencheng (China University of Petroleum, Beijing)

**Summary**

Bandpass filtering is a common way of estimating ground roll in land seismic data because of its low-frequency character. However, the frequency-mixing problem makes it inconvenient, or even impossible, for bandpass filtering alone to achieve a successful removal of ground roll. In this paper, we propose a novel band-limited orthogonalization approach for removing ground roll without harming useful primary reflections. We orthogonalize the initially denoised signal, obtained via bandpass filtering with a relatively high low-bound frequency (LBF), and the corresponding noise section, which contains leaked primary reflections, using local signal-and-noise orthogonalization. The local orthogonalization guarantees that a minimum of coherent primary-reflection energy is lost to the noise section. The proposed approach is very convenient to implement because only a bandpass filter and a regularized division between the initially denoised signal and the initial noise are used. The OZ-25 field dataset demonstrates the successful performance of the proposed approach.
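A minimal sketch of local signal-and-noise orthogonalization, with a boxcar smoother standing in for the shaping regularization of the division (an assumption), and illustrative smoothing and regularization parameters:

```python
import numpy as np

def local_orthogonalize(sig0, noi0, smooth=101, eps=0.01):
    """Estimate a smooth local weight w ~ <sig0, noi0>/<sig0, sig0> by
    regularized division, then move the signal leakage w*sig0 from the
    noise section back into the signal."""
    win = np.ones(smooth) / smooth
    num = np.convolve(sig0 * noi0, win, mode="same")
    den = np.convolve(sig0 * sig0, win, mode="same")
    w = num / (den + eps * den.max())        # regularized division
    return sig0 + w * sig0, noi0 - w * sig0

# Demo: the initial "denoised" signal lost 30% of its amplitude to the
# noise section; orthogonalization recovers most of the leakage.
rng = np.random.default_rng(0)
n = 2000
t = np.arange(n) / 500.0
true = np.sin(2 * np.pi * 8 * t)
data = true + 0.1 * rng.standard_normal(n)
sig0 = 0.7 * true                  # over-aggressive initial denoising
noi0 = data - sig0                 # noise section containing leaked signal
sig1, noi1 = local_orthogonalize(sig0, noi0)
err0 = np.linalg.norm(sig0 - true)
err1 = np.linalg.norm(sig1 - true)
```

After the update, the local correlation between signal and noise sections is close to zero, which is the orthogonality the abstract relies on.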

**Introduction**

Ground roll is a type of seismic noise characterized by high amplitude, low frequency, and low velocity, and it is the main type of coherent noise in land seismic surveys. Ocean-bottom-node (OBN) based marine seismic surveys may also contain this type of noise (Chen et al., 2014). Ground roll usually masks shallow reflections at short offsets and deep reflections at larger offsets (Claerbout, 1983; Saatilar and Canitez, 1988; Henley, 2003), and must be removed before subsequent processing. Unlike random noise, which is incoherent along the spatial direction (Yang et al., 2015; Chen et al., 2015), ground roll is coherent and behaves similarly to the primary reflections, which makes its removal very difficult with simple signal-processing methods. Much research on ground-roll removal has been published, and many researchers have proposed methods for handling the problem (Shieh and Herrmann, 1990; Brown and Clapp, 2000). Most of these approaches either fail to remove all the ground roll or remove much of the useful primary-reflection energy. An efficient and effective technique for removing ground roll is therefore always in demand.

Jiao, Shebao (China University of Petroleum, Beijing) | Chen, Yangkang (The University of Texas at Austin) | Yang, Wencheng (China University of Petroleum, Beijing) | Gan, Shuwei (China University of Petroleum, Beijing) | Wang, Erying (PetroChina Changqing Oilfield Company)

**Summary**

Conventional adaptive-subtraction-based ground-roll attenuation approaches first predict an initial model of the ground roll and then adaptively subtract it using a stationary matching filter (MF). Because of the non-stationarity of seismic data and ground roll, applying a traditional stationary MF is not physically plausible; in the case of highly non-stationary seismic reflections and ground roll, a stationary MF cannot obtain satisfactory results. In this paper, we apply a non-stationary matching filter (NMF) to adaptively subtract the ground roll. The NMF is obtained by solving a highly under-determined inverse problem using non-stationary autoregression (NAR). We apply the proposed approach to one synthetic example and two field data examples and demonstrate much improved performance compared with the traditional MF approach.
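For context, the stationary MF baseline that the NMF improves upon can be sketched as a small least-squares problem; circular shifts and the filter length here are simplifying assumptions:

```python
import numpy as np

def matching_filter(model, data, nf=11):
    """Estimate a single stationary filter f minimizing ||data - M f||_2,
    where M holds lagged copies of the predicted noise model (circular
    shifts for simplicity); return the matched noise M @ f."""
    half = nf // 2
    M = np.stack([np.roll(model, lag) for lag in range(-half, half + 1)],
                 axis=1)
    f, *_ = np.linalg.lstsq(M, data, rcond=None)
    return M @ f

# Demo: the "ground roll" in the data is a scaled, shifted copy of the
# predicted model; a stationary filter recovers it almost exactly --
# which is precisely the case where the NMF is NOT needed.
rng = np.random.default_rng(0)
model = rng.standard_normal(400)
true_noise = 2.0 * np.roll(model, 3)
matched = matching_filter(model, true_noise)
resid = np.linalg.norm(true_noise - matched)
```

The NMF generalizes this by letting the filter coefficients vary smoothly with time and space, which this stationary sketch cannot do.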

**INTRODUCTION**

Ground roll is a type of seismic noise characterized by high amplitude, low frequency, and low velocity, and it is the main type of coherent noise in land seismic surveys. Ocean-bottom-node (OBN) based marine seismic surveys may also contain this type of noise (Chen et al., 2014). Ground roll usually masks shallow reflections at short offsets and deep reflections at larger offsets (Claerbout, 1983; Saatilar and Canitez, 1988; Henley, 2003), and must be removed before subsequent processing. Unlike random noise, which is incoherent along the spatial direction (Chen and Ma, 2014; Yang et al., 2015), ground roll is coherent and behaves similarly to the primary reflections, which makes its removal very difficult with simple signal-processing methods. Much research on ground-roll removal has been published, and many researchers have proposed methods for handling the problem (Shieh and Herrmann, 1990; Brown and Clapp, 2000). The simplest and probably most widely used way of removing ground roll is bandpass filtering: because of its low-frequency character, a simple high-pass filter is usually applied to the seismic record. However, the low-bound frequency (LBF) of the filter is not easy to choose, because a higher LBF damages the primary reflections while a lower LBF fails to remove enough ground roll. This issue arises because primary reflections and ground roll overlap in frequency. To deal with this problem, we can either improve the traditional frequency-based approach or devise new approaches from different perspectives.
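The LBF trade-off can be made concrete with a toy high-pass filter; the frequencies here are illustrative and deliberately non-overlapping, which is exactly the idealization that fails when frequency mixing occurs:

```python
import numpy as np

def highpass(trace, dt, lbf):
    """Zero all frequency components below the low-bound frequency (LBF)."""
    X = np.fft.rfft(trace)
    f = np.fft.rfftfreq(len(trace), dt)
    X[f < lbf] = 0.0
    return np.fft.irfft(X, len(trace))

# Demo: 5 Hz "ground roll" plus a 40 Hz "reflection"; an LBF of 15 Hz
# separates them cleanly here only because the bands do not overlap.
n, dt = 512, 1.0 / 512
t = np.arange(n) * dt
roll = np.sin(2 * np.pi * 5 * t)
refl = 0.3 * np.sin(2 * np.pi * 40 * t)
out = highpass(roll + refl, dt, lbf=15.0)
```

When the two components share frequency bands, no choice of LBF in this sketch can separate them, which motivates the alternatives discussed above.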

Oilfield Places:

- Asia > China > Shanxi Province > Ordos Basin > Changqing Field (0.99)
- Asia > China > Shaanxi Province > Ordos Basin > Changqing Field (0.99)
- Asia > China > Ningxia Province > Ordos Basin > Changqing Field (0.99)

**Summary**

For various reasons, seismic reflections are not continuous even when no faults or discontinuities exist. We propose a novel approach for enhancing the amplitude of seismic reflections and making them continuous. We use plane-wave flattening to provide highly horizontal events for a subsequent empirical-mode-decomposition (EMD) based smoothing in the flattened domain; an inverse plane-wave flattening then restores the original curved events. The plane-wave flattening process requires a precise local slope estimation, which is provided by the plane-wave destruction (PWD) algorithm. The EMD-based smoothing filter is non-parametric and adaptive, and thus convenient to use. Both pre-stack and post-stack field data examples show a tremendous improvement in data quality, which is especially interpretation-friendly.

**INTRODUCTION**

Modern hydrocarbon exploration requires more advanced techniques at each step of seismic acquisition, processing, and interpretation (Chen et al., 2014a). Interpretation is the key step connecting geophysics and geology. A clean pre-stack dataset can aid AVO-related inversion, and a clean migrated image helps geologists make the right decisions and reduces interpretation risk (Chen and Ma, 2014). For various reasons, e.g., unpredictable noise or sampling issues, the raw seismic data might not be clean or coherent enough.

To solve this problem, we cannot simply use a traditional random noise attenuation technique to make the seismic reflections coherent, because this type of discontinuity noise is not spread randomly across the whole profile. Using a traditional random noise attenuation technique (Yang et al., 2015; Chen et al., 2015), such as *f-x* deconvolution (Canales, 1984), *f-x* singular spectrum analysis (Oropeza and Sacchi, 2011), or Karhunen-Loeve (KL) / singular value decomposition (SVD) filtering (Jones and Levy, 1987), would harm much of the useful reflections. The discontinuity problem of some seismic datasets can instead be solved with a smoothing operator. A mean filter is usually used to smooth the seismic reflections rather than solely removing random noise. However, the mean filter requires exactly flattened reflections, otherwise much useful reflection energy is lost. In addition, setting the parameters of the mean filter requires some effort in practice.
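The SVD filtering mentioned above can be sketched in a few lines; like the mean filter, it presumes near-flat (laterally coherent) events, and the rank and noise level here are illustrative:

```python
import numpy as np

def svd_filter(section, k):
    """Keep only the k largest singular values: laterally coherent
    (near-horizontal) reflections live in the leading components."""
    U, s, Vt = np.linalg.svd(section, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k]

# Demo: a flat coherent event plus random noise; a rank-1 approximation
# suppresses the noise while keeping the event.
rng = np.random.default_rng(0)
nt, nx = 200, 40
wavelet = np.exp(-0.5 * ((np.arange(nt) - 100) / 5.0) ** 2)
clean = np.outer(wavelet, np.ones(nx))       # flat event: exactly rank 1
noisy = clean + 0.3 * rng.standard_normal((nt, nx))
filtered = svd_filter(noisy, k=1)
err_noisy = np.linalg.norm(noisy - clean)
err_filt = np.linalg.norm(filtered - clean)
```

For dipping or curved events the rank grows and this breaks down, which is why the proposed method flattens the data first.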

Artificial Intelligence, Chen, data example, data quality, EMD, empirical mode decomposition, geophysics, local slope, machine learning, plane wave, random noise, random noise attenuation, reflection, Reservoir Characterization, section corresponding, seismic data, seismic reflection, Upstream Oil & Gas


Gan, Shuwei (China University of Petroleum, Beijing) | Wang, Shoudong (China University of Petroleum, Beijing) | Chen, Yangkang (The University of Texas at Austin) | Chen, Xiaohong (China University of Petroleum, Beijing)

**Summary**

Following the compressive sensing (CS) theory from the signal-processing field, we propose a new seismic data reconstruction approach based on a fast projection onto convex sets (FPOCS) algorithm with a sparsity constraint in the seislet transform domain. FPOCS converges much faster than conventional POCS (about two thirds of the conventional iterations can be saved). The seislet-transform-based reconstruction achieves obviously better data recovery than *f-k* transform-based scenarios, in terms of both signal-to-noise ratio (SNR) and visual appearance, because of the much sparser structure in the seislet transform domain. Both synthetic and field data examples demonstrate the performance of the proposed approach.

**Introduction**

Seismic data may have missing traces for various reasons. Seismic data reconstruction is a procedure to remove sampling artifacts and improve amplitude analysis, which is very important for subsequent processing steps including high-resolution processing, wave-equation migration, multiple suppression, amplitude-versus-offset (AVO) or amplitude-versus-azimuth (AVAZ) analysis, and time-lapse studies (Trad et al., 2002; Liu and Sacchi, 2004; Abma and Kabir, 2005, 2006; Wang et al., 2010; Naghizadeh and Sacchi, 2010; Li et al., 2012, 2013; Chen et al., 2014a).

In recent years, because of the popularity of compressive sensing (CS) based applications (Candès et al., 2006b), a new paradigm for seismic data acquisition has emerged that can potentially reduce survey time and increase data resolution (Herrmann, 2010). CS is a relatively new paradigm in signal processing that has received a lot of attention. The theory indicates that a signal which is sparse under some basis may still be recovered even when the number of measurements is deemed insufficient by Shannon's criterion. The principle of CS involves solving a least-squares minimization problem with an *L*_{1}-norm penalty on the reconstructed model, which requires balancing a least-squares data-misfit term against a sparsity constraint on the reconstructed model. Iterative shrinkage-thresholding (IST) and projection onto convex sets (POCS) are two common approaches for solving this minimization problem in exploration geophysics.

**Summary**

The synchrosqueezing wavelet transform (SSWT) can capture the local frequency variation in seismic data with high time and frequency resolution. In this paper, we propose a novel application of TF analysis using the SSWT: detecting structural anomalies, such as karstification, which are closely related to oil & gas traps. This application is enabled by the better separability of different frequency components under the SSWT. We divide the total frequency band into sub-bands and analyze the structural behavior delineated in each sub-band. The application of the SSWT is compared with that of the widely used S transform (ST), which shows the great potential of the SSWT as an alternative to traditional TF decomposition approaches. All the studies in this paper are based on 3D migrated data from the Boonsville field.

**Introduction**

Time-frequency (TF) analysis offers extra information when analyzing seismic data and has been used extensively in seismic data processing and interpretation. One commonly exploited observation in TF analysis is that deeper channels are usually stronger at lower frequencies, while the shallower flank of a channel has stronger amplitudes at higher frequencies (Chen et al., 2014a). In addition, TF analysis can be used for estimating attenuation, pore-pressure prediction, detecting seismic unconformities, and some implementations of seismic chronostratigraphy (Lin et al., 2013). In this paper, we introduce a novel application of TF analysis, based on a high-resolution TF analysis approach, to delineate karstification.

Over the past two decades, many TF analysis approaches have been proposed. The most straightforward is the short-time Fourier transform (STFT) (Cohen, 1995), which applies the Fourier transform to small windows defined by rectangular functions at different positions. Because of its convenient implementation, the STFT has been widely used and extensively modified to suit broader applications. One widely used modification is the Gabor transform, which applies the Fourier transform to windows defined by Gaussian functions at different positions. Both the STFT and the Gabor transform suffer from the compromise between time and frequency resolution because both use a constant window size. A better modification of the STFT that partially solves this resolution-balancing problem is to apply windows of different widths, with the width of the Gaussian function chosen according to the frequency component: wider Gaussians for low-frequency components and narrower Gaussians for high-frequency components.
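A bare-bones STFT makes the constant-window-size behavior concrete; the Hann window, window length, and hop size are illustrative choices, not prescribed by any method above:

```python
import numpy as np

def stft_mag(x, win_len=64, hop=16):
    """Magnitude STFT: Fourier transforms of windowed segments at
    different positions (constant window size, hence a fixed
    time-frequency resolution)."""
    win = np.hanning(win_len)
    frames = [np.abs(np.fft.rfft(win * x[i:i + win_len]))
              for i in range(0, len(x) - win_len + 1, hop)]
    return np.array(frames)            # shape (n_frames, n_freqs)

# Demo: a two-tone signal -- 48 Hz in the first half, 160 Hz in the
# second; the dominant frequency bin moves between early and late frames.
fs, n = 512, 512
t = np.arange(n) / fs
x = np.where(t < 0.5,
             np.sin(2 * np.pi * 48 * t),
             np.sin(2 * np.pi * 160 * t))
S = stft_mag(x)
df = fs / 64                           # 8 Hz per rfft bin
early_peak = np.argmax(S[0]) * df
late_peak = np.argmax(S[-1]) * df
```

Every frame here resolves frequency to the same 8 Hz bin width; variable-width windows (and, further, the SSWT) are attempts to escape that fixed trade-off.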

annual meeting, application, Artificial Intelligence, Boonsville Field, crossline 75, data quality, decomposition, frequency, frequency slice, geophysics, karstification, machine learning, Reservoir Characterization, resolution, seismic data, SSWT, structural anomaly, TF analysis, TF analysis approach, Upstream Oil & Gas, wavelet, wavelet transform


Gan, Shuwei (China University of Petroleum, Beijing) | Wang, Shoudong (China University of Petroleum, Beijing) | Chen, Yangkang (The University of Texas at Austin) | Zhang, Yizhuo (Institut de Physique du Globe de Paris) | Jin, Zhaoyu (University of Edinburgh)

**Summary**

Interpolating regularly missing traces in seismic data is thought to be much harder than interpolating irregularly missing traces, because many sparsity-based approaches cannot be used due to the strong aliasing noise in the sparse domain. We propose to use the seislet transform in a sparsity-based approach to interpolate highly under-sampled seismic data within the classic projection onto convex sets (POCS) framework. Many numerical tests show that the local slope is the main factor affecting the sparsity and anti-aliasing ability of the seislet transform. By low-pass filtering the under-sampled seismic data with a very low bound frequency, we can obtain a very precise dip estimate, which makes the seislet transform capable of interpolating aliased seismic data. To maintain an optimal local slope during the iterations, we update the slope field every few iterations. We also use a percentile-thresholding approach to better control the reconstruction performance. Both synthetic and field examples show the excellent performance of the proposed approach.
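Percentile thresholding itself is simple to state; the percentile p is the knob that would be varied across POCS iterations (the demo values below are illustrative):

```python
import numpy as np

def percentile_threshold(C, p):
    """Keep only coefficients whose magnitude is at or above the p-th
    percentile of all magnitudes; zero the rest."""
    tau = np.percentile(np.abs(C), p)
    return np.where(np.abs(C) >= tau, C, 0.0)

# Demo: keep the top 10% of 100 coefficients.
C = np.arange(1.0, 101.0)
kept = percentile_threshold(C, 90.0)
```

Unlike a fixed amplitude threshold, the percentile form adapts automatically to the coefficient scale, which is why it gives better control over how much energy each iteration retains.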

**Introduction**

Seismic data may have missing traces for various reasons. Seismic data reconstruction is a procedure to remove sampling artifacts and improve amplitude analysis, which is very important for subsequent processing steps including high-resolution processing, wave-equation migration, multiple suppression, amplitude-versus-offset (AVO) or amplitude-versus-azimuth (AVAZ) analysis, and time-lapse studies (Trad et al., 2002; Liu and Sacchi, 2004; Abma and Kabir, 2005; Wang et al., 2010; Naghizadeh and Sacchi, 2010). In recent years, with the development of compressive sensing, many sparsity-based methods have appeared for interpolating irregularly sampled seismic data. For regularly missing traces, however, sparsity-based methods (Abma and Kabir, 2006; Li et al., 2012, 2013; Chen et al., 2014a) cannot obtain satisfactory results because of the strong aliasing noise in the transform domain; prediction-based approaches (Spitz, 1991; Naghizadeh and Sacchi, 2007) remain the best option for interpolating regularly missing traces.

In this paper, we propose to use the seislet transform for sparsity-based reconstruction within the well-established projection onto convex sets (POCS) framework (Abma and Kabir, 2006). Many numerical studies show that the local slope is the main factor affecting the sparsity and anti-aliasing ability of the seislet transform. Although we cannot obtain a precise dip estimate from the original aliased data, we can use low-pass-filtered data (below 15 Hz) to estimate the local slope, construct the seislet transform of the full-band seismic data, and perform thresholding. Synthetic and field data examples show excellent results using the proposed approach.
