Seg Houston 2009 : A Generalized Probabilistic Approach For Processing Seismic Data
May 15, 2015
Eage 2009 : Sigma Processing Of Vsp Data
May 15, 2015

EAGE ROME 2008 : WHY SHOULD WE USE PROBABILITY MODELS FOR SEISMIC DATA PROCESSING?

Why should we use probability models for processing seismic amplitude data? How do spatial filters such as factorial kriging compare to their deterministic counterparts? These questions are not easy to answer in terms accessible to operational seismic processing geophysicists. This paper starts from a very basic example, taken from shot-point processing, that allows a straightforward comparison between standard Wiener amplitude filtering and its geostatistical counterpart, a specific factorial kriging model [3] [4]. A more comprehensive well VSP case, currently being processed, will also be presented. Comparing both filters from theoretical and practical points of view highlights that both make use of the same trace autocorrelation function familiar to geophysicists, though not within the same conceptual framework. It is shown that the geophysical Wiener filters can be reproduced using a factorial kriging probability model, and that this type of modelling opens the way to quantifying seismic processing uncertainties.
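The core claim, that Wiener filtering and factorial kriging build their weights from the same trace autocorrelation, can be illustrated with a minimal sketch. The covariance model below (exponential signal covariance plus a white-noise nugget, with made-up parameters) is an assumption for illustration only, not the model used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
lags = np.arange(n)

# Assumed two-component model: correlated "signal" structure plus a
# white-noise nugget -- the two structures a factorial kriging model
# would separate, and the same ingredients a Wiener filter needs.
sigma_s2, corr_range, sigma_n2 = 1.0, 10.0, 0.25
c_signal = sigma_s2 * np.exp(-lags / corr_range)  # signal autocovariance

# Toeplitz covariance matrices built from the autocorrelation function
idx = np.arange(n)
C_s = c_signal[np.abs(idx[:, None] - idx[None, :])]  # signal covariance
C_z = C_s + sigma_n2 * np.eye(n)                     # data = signal + nugget

# Synthetic noisy trace drawn from the same model
signal = rng.multivariate_normal(np.zeros(n), C_s)
trace = signal + rng.normal(0.0, np.sqrt(sigma_n2), n)

# Wiener / simple-kriging weights: W = C_s @ C_z^{-1}
# Each row of W is the set of kriging weights estimating the signal
# component at one sample from the whole noisy trace.
W = C_s @ np.linalg.inv(C_z)
estimate = W @ trace

# The filtered trace should sit closer to the signal than the raw data
err_raw = np.mean((trace - signal) ** 2)
err_filt = np.mean((estimate - signal) ** 2)
print(err_filt < err_raw)
```

In this formulation the matrix `W = C_s C_z^{-1}` is simultaneously the discrete Wiener filter and the simple-kriging estimator of the correlated component, which is the sense in which the two frameworks use the same autocorrelation information; the probabilistic reading additionally yields the estimation variance, hence the route to uncertainty quantification mentioned in the abstract.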